TheTerrasque t1_j5irq12 wrote

On a side note, 7B isn't large these days.

GPT-3 and BLOOMZ are around 175B parameters.

andreichiffa t1_j5ivwgc wrote

Or OPT-175B.

However, 7B is more than large enough to do a lot of the shady stuff that 175B models can do. Even 1.5B models are already starting to do a good job in the hands of a minimally competent user.
