norcalnatv OP t1_j84wt52 wrote
Reply to comment by norcalnatv in The Inference Cost Of Search Disruption – Large Language Model Cost Analysis [D] by norcalnatv
If the ChatGPT model were ham-fisted into Google’s existing search businesses, the impact would be devastating. There would be a $36 billion reduction in operating income. This is $36 billion of LLM inference costs.
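As a back-of-the-envelope sketch of how a per-query inference cost scales to a figure of that size: the search volume and cost-per-query below are illustrative assumptions, not numbers taken from the article or this thread.

    # Rough arithmetic behind a ~$36B-scale annual inference bill.
    # Both inputs are assumptions for illustration only.
    searches_per_day = 8.5e9          # assumed Google search volume
    queries_per_year = searches_per_day * 365
    assumed_cost_per_query = 0.011    # assumed LLM inference cost, in dollars

    annual_inference_cost = queries_per_year * assumed_cost_per_query
    print(f"~${annual_inference_cost / 1e9:.0f}B per year")  # lands in the $30-40B range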
Himalun t1_j8593ax wrote
It’s worth noting that both MS and Google own their data centers and hardware, so it is likely cheaper for them to run. But still expensive.
Downchuck t1_j8500e1 wrote
Perhaps the number of unique queries is overstated: with vector similarity search and result caching, the vast majority of lookups would be duplicates of searches already materialized. OpenAI has now introduced a "premium" option, which suggests a market for paid search and room for more cash inflow. That may change their spend strategy, with less going to marketing and more to hardware.
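A minimal sketch of the kind of semantic cache being described: embed the incoming query, reuse a cached answer when a previously seen query is close enough in embedding space, and only pay for LLM inference on a miss. The names (embed, run_llm) and the 0.9 similarity threshold are illustrative assumptions, not anything from the thread or a specific product.

    # Hypothetical semantic query cache: duplicate-ish queries skip inference.
    import numpy as np

    class SemanticCache:
        def __init__(self, embed, run_llm, threshold=0.9):
            self.embed = embed          # callable: str -> unit-norm 1-D vector
            self.run_llm = run_llm      # callable: str -> str (the costly path)
            self.threshold = threshold  # cosine similarity needed for a cache hit
            self.vectors = []           # cached query embeddings
            self.answers = []           # cached responses

        def lookup(self, query: str) -> str:
            q = self.embed(query)
            if self.vectors:
                sims = np.array(self.vectors) @ q      # cosine sims (unit vectors)
                best = int(np.argmax(sims))
                if sims[best] >= self.threshold:
                    return self.answers[best]          # near-duplicate: no inference
            answer = self.run_llm(query)               # cache miss: run the model
            self.vectors.append(q)
            self.answers.append(answer)
            return answer

How much this actually saves depends on how skewed the query distribution is and how aggressive the similarity threshold can be before answers start being reused inappropriately.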