By John Gruber
Alex Stamos — former head of security at Facebook, currently at Stanford Internet Observatory, in a thread on Twitter:
In my opinion, there are no easy answers here. I find myself constantly torn between wanting everybody to have access to cryptographic privacy and the reality of the scale and depth of harm that has been enabled by modern comms technologies.
Nuanced opinions are OK on this.
Good thread with much to consider.
Apple:
Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
Can non-CSAM images be “injected” into the system to flag accounts for things other than CSAM?
Our process is designed to prevent that from happening. The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design. Finally, there is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. In the unlikely event of the system flagging images that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
This FAQ is good, and addresses most of the misconceptions I’ve seen. The human review step for flagged accounts is key to the trustworthiness of the system.
I do wonder, though, how prepared Apple is to manually review a potentially staggering number of accounts that are correctly flagged. Because Apple doesn’t examine the contents of iCloud Photo Library (or local on-device libraries), I don’t think anyone knows how prevalent CSAM is on iCloud Photos. We know Facebook reported 20 million instances of CSAM to NCMEC last year, and Google reported 546,000. For Facebook, that’s about 55,000 per day; for Google, 1,500 per day. I think it’s a genuine “we’ll soon find out” mystery how many iCloud Photos users are going to be accurately flagged for exceeding the threshold of CSAM matches when this goes live. If the number is large, it seems like one innocent needle in a veritable haystack of actual CSAM collections might be harder for Apple’s human reviewers to notice.
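As a quick sanity check of those per-day figures (a minimal sketch; the annual totals are the NCMEC report counts cited above, and the division simply assumes reports spread evenly across 365 days):

```python
# Sanity-check the per-day report figures cited in the text.
# Annual CSAM report counts to NCMEC, as cited above.
annual_reports = {
    "Facebook": 20_000_000,
    "Google": 546_000,
}

for service, per_year in annual_reports.items():
    per_day = per_year / 365  # naive even-distribution assumption
    print(f"{service}: about {per_day:,.0f} reports per day")
```

That works out to roughly 54,800 per day for Facebook and 1,500 per day for Google, matching the rounded figures above.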
My thanks to Tara for sponsoring last week at DF. Tara helps developers make great software as quickly as possible. Here are three blockers they hear often from developers:
Getting everyone aligned — Get ideas across with a clear problem statement and requirements. Use a tool like Tara to get sign-off before you start.
Visibility into actual progress — Code changes are the best indicators of progress. Use tools that enable transparency. With Tara, everyone can see commits, blocks, and merges for a sense of true progress.
Manual status updates — Manual updates are the Achilles heel of every project. Use tools that automate tedious, low-value actions — like Tara’s auto-status that marks tasks as done when a PR merges.
One workspace for your team’s docs, sprints, and tasks synced to code. Plus an API for custom workflows. Get started on Tara for free.
To me, version 14 feels like the biggest major release of BBEdit in many years. For programmers, there’s a major new feature: Language Server Protocol (LSP) support, which, basically, adds a slew of IDE-style functionality for code completion, refactoring, and linting. There are also new language modules for R, Go, Lisp, and Rust.
For everyone, there’s a new “notes” feature. From the release notes:
You know that thing where you have a whole bunch of untitled documents open, because it’s so easy to make one and type some notes, and then just leave it open? And you rely on BBEdit’s amazing crash recovery and document restoration to not lose your carefully kept notes? You can keep doing that if you want, but we have a new feature to make the whole thing faster and easier: Notes.
Notes are mostly like ordinary text documents, except that you don’t have to remember to save them or even make up a name if you don’t want to. BBEdit keeps notes all together in a “notebook”. Notes exist on disk as text files; there’s no secret file format involved. […]
There are many ways to make a note, so you can use whatever fits your workflow and style.
Notes default to Markdown, but you can change that to whatever you want (of course).
See also:
Jason Snell, who wrote a nifty AppleScript that lets him drag an image into a Markdown file in BBEdit 14 and have it (a) upload that image to his server in the background, and (b) insert the Markdown syntax to reference the just-uploaded file.
Watts Martin, who has a good overview of where BBEdit 14 stands compared to several popular competing programming text editors.