14 Comments
Javier Jurado:

The debate around the neutrality of technology always finds analogies that both sides use to defend their position: a knife can be used both to cut and share bread or to kill someone, but it’s hard to imagine a positive use for a Kalashnikov. What is AI more similar to?

Alberto Romero:

A knife 100%.

Logan Thorneloe:

Good overview. The largest complaint I'd heard (before the amendments, which I haven't looked at yet) was that it would stifle open-source innovation, because open-source LLMs from large companies, and their derivatives, are covered by the bill. That means the bill hits smaller companies building on those open-source models harder than the large companies, which can absorb the regulatory burden.

Alberto Romero:

Yeah, I've heard the same (e.g. Meta's Llama). I'm not sure how that would work (mostly because of my limited knowledge of how the law applies in specific instances), but as far as I know it's the original developer that's liable, not the user, as long as the fine-tuning doesn't exceed $10M, no?

sean pan:

Correct. It has since been amended to mostly exclude open weights.

L A Chambers:

This was a very helpful overview, thank you. I'm still undecided (not that my opinion really matters).

Sheila Dean:

"10^26" et al. might be written in terms of exaflops.
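
A quick unit check, as an aside (assuming "exaflop" refers to the SI prefix exa = 10^18): the bill's 10^26 FLOP threshold corresponds to 10^8 exaFLOPs, so the prefix alone doesn't absorb the whole number.

```python
# SI prefix: exa = 10^18. The bill's training-compute threshold is 10^26 FLOPs.
EXA = 10 ** 18
threshold_flops = 10 ** 26

# Express the threshold in exaFLOPs.
exaflops = threshold_flops // EXA
print(exaflops)  # 100000000, i.e. 10^8 exaFLOPs
```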

sean pan:

Honestly, most of the complaints about the bill predate its amendment. At this point, it only asks for "reasonable consideration" and is just a transparency bill that hits the large companies.

Fei-Fei Li has associations with Big Tech companies; most independent AI academics and tech workers support it.

Alberto Romero:

As far as I know, Google said its concerns still stand under the latest amendments, and Meta and OpenAI will likely not change their stance. Ng, Li, and LeCun argue against regulating the technology itself, and the bill still regulates at that level. I understand there are associations there, but Hinton and Bengio have associations too, and they support it. It's a complex debate for sure, not so easily dismissed.

sean pan:

Hinton and Bengio are far more independent, while Fei-Fei Li is directly funded by pmarc.

At any rate, this only asks for reasonable care. The security plan, as Zvi noted, could be as simple as:

"We all have to laugh before we release a model."

https://open.substack.com/pub/thezvi/p/guide-to-sb-1047

Eugene Bordelon:

You agreed with someone who asked whether AI is like "a knife [that] can be used both to cut and share bread or to kill someone."

No - it is more like “nuclear technology that can be used to help mankind or destroy us”.

Actually, looking into the future (perhaps not so distant), if AI becomes AGSI (Artificial General Super Intelligence), it will supersede our species. The question then is what it will do with us: destroy us or put us in a zoo?

In either case, regulation, from the point of view of us humans, is a MUST!


Alberto Romero:

Right now it's more like a knife.

Eugene Bordelon:

This is the best time to start developing regulations, while it is just a knife.

Marc hill:

What about AI developed by ETs?
