Scrapii
@anonymize-pii.bsky.social
🇨🇦. entreprenerd.

Scrapii.net
You (and your data) are the product.

Redact and anonymize your data before your shadow profile fills in the blanks.

Scrapii.net
Increasingly, HIPAA Can’t Stop AI from De-Anonymizing Patient Data
Even after hospitals strip out names and zip codes, modern AI can sometimes still work out who patients are. Great news for insurance companies; not so much for healthcare recipients. New research fr...
www.unite.ai
February 13, 2026 at 7:18 AM
Reposted by Scrapii
The number of times I’ve helped implement HIPAA compliance initiatives when the client doesn’t even have a solid comprehension of it is astounding.
December 18, 2024 at 3:54 PM
Reposted by Scrapii
Talked with the dude who is going to bring AI into CVS’s MinuteClinic app.

Anyway. Don’t use the MinuteClinic app; the guy did not really seem all that concerned with HIPAA compliance.
February 3, 2026 at 7:34 PM
So I’m new to this #buildinpublic thing, but I’m working on an API to handle redaction and anonymization of PII in LLM pipelines. Needed it for #HIPAA compliance, and thought others might too.

Published a demo tool you can plug an API key into and try it out:

GitHub.com/scrapii/scrapii-demo
GitHub - scrapii/scrapii-demo: An interactive demo showcasing the Scrapii.net PII redaction & tokenization API. Paste text or upload a document, watch it get tokenized in real time, then detokenize to restore the originals ...
GitHub.com
February 11, 2026 at 11:41 PM
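A minimal sketch of the intended flow, in Python with the requests library. The endpoint paths, field names, and response shape below are assumptions for illustration only; the demo repo above is the reference for the real API.

import os
import requests

# Hypothetical endpoints and payloads; the actual Scrapii API may differ.
API_BASE = "https://api.scrapii.net/v1"          # assumed base URL
HEADERS = {"Authorization": f"Bearer {os.environ['SCRAPII_API_KEY']}"}

def tokenize(text: str) -> dict:
    # Swap PII for reversible tokens before the text leaves your pipeline.
    resp = requests.post(f"{API_BASE}/tokenize", json={"text": text}, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()  # assumed shape: {"text": "...", "session_id": "..."}

def detokenize(text: str, session_id: str) -> str:
    # Restore the original values in the model's reply.
    resp = requests.post(
        f"{API_BASE}/detokenize",
        json={"text": text, "session_id": session_id},
        headers=HEADERS,
    )
    resp.raise_for_status()
    return resp.json()["text"]

# Tokenize -> call the LLM with the scrubbed text -> detokenize the reply.
scrubbed = tokenize("Patient Jane Doe, MRN 123456, reports chest pain.")
# reply = call_your_llm(scrubbed["text"])        # raw PII never reaches the model
# print(detokenize(reply, scrubbed["session_id"]))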
Reposted by Scrapii
My bank account number is ██████

Oops! 🥴

Or how to [REDACT] Personally Identifiable Information (PII) before sending it to your #LLM or logging it!

(using #GoogleCloud Data Loss Prevention #API)

glaforge.dev/posts/2024/1...
Redacting sensitive information when using Generative AI models
As we are making our apps smarter with the help of Large Language Models, we must keep in mind that we are often dealing with potentially sensitive information coming from our users. In particular, in...
glaforge.dev
November 25, 2024 at 11:19 AM
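The linked post walks through Google Cloud’s Data Loss Prevention API; as a rough companion, here is a minimal sketch using the google-cloud-dlp Python client’s deidentify_content call to mask common PII info types before text is sent to an LLM or written to logs. The project ID and the info-type list are placeholders to adapt.

import os
from google.cloud import dlp_v2

# Requires the google-cloud-dlp package and Application Default Credentials.
project_id = os.environ.get("GOOGLE_CLOUD_PROJECT", "my-project")  # placeholder

dlp = dlp_v2.DlpServiceClient()
parent = f"projects/{project_id}"

text = "Call Jane Doe at 555-867-5309 or email jane.doe@example.com"

# Which PII types to look for (a small illustrative subset).
inspect_config = {
    "info_types": [
        {"name": "PERSON_NAME"},
        {"name": "PHONE_NUMBER"},
        {"name": "EMAIL_ADDRESS"},
        {"name": "CREDIT_CARD_NUMBER"},
    ]
}

# Replace each finding with its info type, e.g. "[PHONE_NUMBER]".
deidentify_config = {
    "info_type_transformations": {
        "transformations": [
            {"primitive_transformation": {"replace_with_info_type_config": {}}}
        ]
    }
}

response = dlp.deidentify_content(
    request={
        "parent": parent,
        "deidentify_config": deidentify_config,
        "inspect_config": inspect_config,
        "item": {"value": text},
    }
)

print(response.item.value)
# e.g. "Call [PERSON_NAME] at [PHONE_NUMBER] or email [EMAIL_ADDRESS]"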
Reposted by Scrapii
Do OpenAI really expect users to give up data that is classed as sensitive PII?

And should users willingly give this to an LLM platform?

Absolutely not.
January 8, 2026 at 4:16 PM