Comments

4e_65_6f t1_irv74gd wrote

GPT-3 is trained on sequences to 'predict' what word comes next.

You could probably train it to predict the weather: feed it a database of sequences of weather events, and it should output the event most likely to happen next based on past patterns.

In theory this principle should work for anything, as long as your database accurately describes the events as an understandable sequence of text.
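The idea above can be sketched with a toy next-event predictor. This is not GPT-3; it is a minimal bigram (Markov) model that counts which weather state most often follows each state and predicts the most frequent successor. The event sequence is invented for illustration.

```python
from collections import Counter, defaultdict

# Invented toy sequence of weather events (the "database" of past states).
history = ["sunny", "rain", "rain", "rain", "sunny", "cloudy",
           "rain", "rain", "sunny", "cloudy", "rain", "rain"]

# Count transitions: for each state, tally which state followed it.
transitions = defaultdict(Counter)
for prev, nxt in zip(history, history[1:]):
    transitions[prev][nxt] += 1

def predict_next(state):
    """Return the most frequent successor of `state` in the history."""
    return transitions[state].most_common(1)[0][0]

print(predict_next("rain"))
```

A real sequence model (like GPT-3) generalizes this by conditioning on much longer context windows rather than just the previous state, but the underlying principle is the same: estimate what comes next from past sequences.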

Thorusss t1_irv7qc8 wrote

Of course. It's kind of a naive question; sequence prediction is one of the main uses of these models. What will the user click next, what will the weather do, how will nuclear fusion behave, how will the stock market move, will the car in front of you brake, etc.

footurist t1_irwffnf wrote

If you're thinking of going toward capabilities even remotely approaching Laplace's demon (even just for tiny chunks of the universe, like the weather of city x), then sadly (or not?) that kind of certainty is way too computationally expensive and requires datasets no one could assemble.

However, much weaker variants may be possible; I don't know enough to say.

SPOILER

>!That said, in the TV show Devs they got it to work, lol.!<
