it is absolutely true that AI keeps a record of everything fed into it
No it isn't.
A properly trained deep learning system will ultimately be far smaller than all of the data it's been trained on. It's simply impossible for it to have retained a record of very much of it at all.
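To put rough numbers on that (illustrative figures, not ones from the post): even a big model's weights take up a tiny fraction of the space its training text does.

```python
# Back-of-envelope comparison (illustrative numbers, not figures from the
# post): a large model's weights are orders of magnitude smaller than the
# text it was trained on, so a verbatim copy simply wouldn't fit.
params = 7e9            # assumed model size: 7 billion parameters
bytes_per_param = 2     # 16-bit weights
model_gb = params * bytes_per_param / 1e9

training_tokens = 1e12  # assumed corpus: roughly a trillion tokens
bytes_per_token = 4     # very roughly 4 bytes of raw text per token
corpus_gb = training_tokens * bytes_per_token / 1e9

print(f"model:  ~{model_gb:,.0f} GB")                     # ~14 GB
print(f"corpus: ~{corpus_gb:,.0f} GB")                    # ~4,000 GB
print(f"corpus is ~{corpus_gb / model_gb:,.0f}x larger")  # ~286x
```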
When everything is working correctly it shouldn't have any of the actual text stored at all. Certainly every single piece of training data will have left some impression on the model, but that's a very long way from actually storing the training data. The model consists of statistical relationships, not a copy-paste of the inputs.
Strictly speaking there is something resembling text in the model, but it's made up of the smallest possible units of language (unless there's been overfitting, in which case the training has gone wrong and there probably would be a case to answer).
The model builds sentences from a list of "phrases" which don't even need to line up with word boundaries. Things like "is a" might be treated as a "word", as might "ing", if the model finds that to be a useful snippet.
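As a toy illustration (the vocabulary and the greedy matching rule below are invented for the example, not taken from any real tokenizer): a fixed list of "phrases" can carve text up without caring where words begin or end.

```python
# Toy sketch of subword "phrases" (invented vocabulary, not any real model's
# tokenizer).  Real systems learn pieces like these automatically, e.g. via
# byte-pair encoding; the point is only that the pieces ignore word boundaries.
VOCAB = ["this is a", "work", "ing", "test", " "]

def tokenize(text, vocab=VOCAB):
    """Greedily take the longest known piece at each position, falling back to single characters."""
    pieces = sorted(vocab, key=len, reverse=True)
    tokens = []
    while text:
        match = next((p for p in pieces if text.startswith(p)), text[0])
        tokens.append(match)
        text = text[len(match):]
    return tokens

print(tokenize("this is a working test"))
# ['this is a', ' ', 'work', 'ing', ' ', 'test']
```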
What they're describing is a holographic lens, basically.
Holograms have the neat property that a holographic photograph of a lens will behave like a lens. You can skip a step and just imprint the idea of the lens into a holographic film directly without actually taking a photo, but the idea is the same.
The result is a sheet of something like glass or plastic, as thin as you like, which behaves as if it were a huge glass lens that would have been so thick you couldn't realistically use it.
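For a sense of what "imprinting the idea of the lens" means in practice, here's a sketch (with assumed example values for the wavelength and focal length, none of which come from the post): the film only has to impose the right phase delay at each radius, wrapped into a single wave.

```python
import numpy as np

# Illustrative sketch (assumed example values, not from the original comment):
# the phase delay a thin holographic film must imprint at each radius so that
# a flat sheet focuses collimated light the way a thick glass lens would.
wavelength = 550e-9    # metres; assumed green design wavelength
focal_length = 10e-3   # metres; assumed focal length

def lens_phase(r):
    """Extra phase a ray at radius r needs so that all rays arrive at the focus in step."""
    return (2 * np.pi / wavelength) * (focal_length - np.sqrt(r**2 + focal_length**2))

radii = np.linspace(0, 5e-3, 6)            # sample points across a 5 mm half-aperture
wrapped = lens_phase(radii) % (2 * np.pi)  # wrap to one wave: the film never needs more than 2*pi of depth
for r, phi in zip(radii, wrapped):
    print(f"r = {r * 1e3:3.0f} mm -> phase = {phi:4.2f} rad")
```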
It's a great idea, but very difficult to manufacture, because you need to add something a bit like a fingerprint to the surface of your glass, with ridges so fine that the gap between them is smaller than the wavelength of light. That means they're only a couple of hundred nanometres apart (at most).
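Rough arithmetic behind that figure (illustrative values only, not numbers from the post):

```python
import math

# Rough arithmetic behind "a couple of hundred nanometres" (illustrative
# values, not figures from the original comment).  Locally the holographic
# surface is a diffraction grating: bending light by an angle theta takes a
# fringe period of wavelength / sin(theta), and each fringe has to be drawn
# with at least a ridge and a groove, so individual features end up around
# half that period or finer.
wavelength_nm = 550  # assumed green design wavelength

for theta_deg in (30, 60, 80):  # gentle bend near the centre, sharp bend at the edge of a fast lens
    period_nm = wavelength_nm / math.sin(math.radians(theta_deg))
    print(f"bend {theta_deg:2d} deg: fringe period ~{period_nm:3.0f} nm, "
          f"feature size ~{period_nm / 2:3.0f} nm or finer")
```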
We can make features that small, but it's very difficult to do it correctly over the whole surface of a usefully large lens. I knew of people trying a decade ago, and they weren't getting anywhere fast.