Submitted by IluvBsissa t3_11e7csf in singularity
JVM_ t1_jad8flc wrote
I think it's much less.
Give me a piece of graph paper, laid out like the game Battleship.
Now, you want me to draw all the roads around your house. You could tell me to draw a road that goes through A1, A2, A3, A4, A5. But what if your road goes all the way to 100? You'd quickly switch to "draw a road from A1 to A100."
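To make the compression concrete, here's a minimal Python sketch of that idea: a short range spec like "A1:A100" expands into the full list of grid cells. The `A1:A100` notation and the function name are illustrative, not anything standard.

```python
# Illustrative sketch: a compact spec expands to many grid cells.
# The "A1:A100" notation is hypothetical, chosen for this example.

def expand_road(spec: str) -> list[str]:
    """Expand a spec like 'A1:A100' into every cell the road covers."""
    start, end = spec.split(":")
    row = start[0]                       # row letter, e.g. "A"
    first, last = int(start[1:]), int(end[1:])
    return [f"{row}{n}" for n in range(first, last + 1)]

cells = expand_road("A1:A100")
print(len(cells))  # a ~7-character spec expands to 100 cells
```

The ratio between the spec's length and the output's length is the whole point: the instruction stays short no matter how long the road gets.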
I think this is where you can cut corners with code generation as well.
"Make me a person object with a name and age. They can have friends who are also people."

"Store this in a database that has scaling and load balancing based on X parameters."
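One plausible expansion of that first prompt, as a minimal Python sketch (the class and field names are just one reasonable reading of the request):

```python
from dataclasses import dataclass, field


@dataclass
class Person:
    """A person with a name, an age, and friends who are also people."""
    name: str
    age: int
    friends: list["Person"] = field(default_factory=list)


alice = Person("Alice", 30)
bob = Person("Bob", 28)
alice.friends.append(bob)
print(alice.friends[0].name)  # Bob
```

A one-sentence prompt fans out into a class definition, type annotations, and a mutable-default workaround: detail the model fills in without the user spending tokens on it.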
I think the number of tokens required to generate software is much lower than you'd expect, but having the LLM understand the previous context and tailor its response to what was previously generated would need to change from what we see today.