Submitted by Kaarssteun t3_zz3lwt in singularity
SoylentRox t1_j2aehvs wrote
Reply to comment by YouGotNieds in OpenAI might have shot themselves in the foot with ChatGPT by Kaarssteun
I don't see how "lifetime access" makes any sense.
(1) Assuming it's access to the current model and not future updates, that would be like buying a "lifetime copy" of an internal MS-DOS 0.7 beta (whatever they called it back then), or an original iPhone loaded with a pre-release copy of the OS.
It may work offline for your lifetime, but it's going to be useless compared to what's available within months.
(2) Who's hosting it? GPT-3 is around 175 billion parameters, roughly 350 gigabytes of weights at 16-bit precision, so the only practical way to run it right now is a node of 8 Nvidia A100s with 80 GB of memory each, where each card costs around $25,000 and draws up to 400 watts.
I'm not sure how fast it actually is. If you watch ChatGPT typing for 30 seconds, are you drawing 3.2 kilowatts of power just for your session? I doubt it's that high; the delays are probably because the cluster is servicing other users.
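A rough back-of-envelope sketch of those numbers, assuming (my assumptions, not OpenAI's) 175B parameters stored in fp16 and a standard 8x A100-80GB node at ~$25,000 and ~400 W per card:

```python
# Back-of-envelope hosting estimate (assumed figures, not official ones).
params = 175e9                            # GPT-3 parameter count
bytes_per_param = 2                       # fp16 weights
weights_gb = params * bytes_per_param / 1e9

gpus = 8                                  # typical DGX/HGX A100 node
node_mem_gb = gpus * 80                   # headroom left for activations / KV cache
node_cost_usd = gpus * 25_000
node_power_kw = gpus * 400 / 1000

print(f"weights:     {weights_gb:.0f} GB")    # ~350 GB
print(f"node memory: {node_mem_gb} GB")       # 640 GB
print(f"node cost:   ${node_cost_usd:,}")     # $200,000
print(f"node power:  {node_power_kw:.1f} kW") # 3.2 kW
```

That 3.2 kW figure is the whole node flat out, which is why it almost certainly isn't attributable to any single user's session.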