
Forstmannsen t1_jdxy37q wrote

TBH, the question from the tweet is relevant. LLMs produce statistically likely outputs for their inputs. Is an unknown scientific principle a statistically likely output given a description of the phenomena? Rather tricky question, if you ask me.
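To make "statistically likely outputs" concrete, here's a toy sketch of what a language model does at each step: sample the next token from a conditional probability distribution. The prompt and the probabilities below are invented for illustration, not taken from any real model.

```python
# Toy illustration of "statistically likely outputs": an autoregressive model
# repeatedly samples the next token from a learned conditional distribution.
# The prompt and probabilities here are made up for the example.
import random

# Hypothetical next-token distribution given the prompt "The apple fell because of"
next_token_probs = {
    "gravity": 0.62,   # continuations common in the training data dominate
    "the": 0.20,
    "wind": 0.10,
    "a": 0.05,
    "a new principle": 0.03,  # a genuinely novel idea sits in the low-probability tail
}

def sample_next_token(probs):
    """Sample one token in proportion to its probability."""
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

print(sample_next_token(next_token_probs))
```

The point of the sketch is just that the model favors continuations that were common in its training data, which is why asking it for something genuinely unknown is the tricky part.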

Honestly, so far I see LLMs as more and more efficient bullshit generators. Will they automate many humans out of work? Sure, the production of bullshit is a huge industry. Are we ready for a mass deluge of algorithmically generated bullshit indistinguishable from human-generated bullshit? No, we aren't. We'll get it anyway.

1