Llama 3.3 70B
About
Advanced large language model with 70 billion parameters, optimized for instruction-following and rapid response.
Settings
Temperature: Controls the randomness of sampling. Higher values make the output more creative; lower values make it more focused.
Top P: The nucleus sampling threshold. The model samples only from the smallest set of tokens whose cumulative probability exceeds this value.
Context length: The maximum number of tokens the model accepts as input.
Response length: The maximum number of tokens the model may generate in its output.
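
These settings correspond to the sampling and length parameters of a typical chat-completions request. The sketch below shows how they might be passed when calling the model, assuming it is served behind an OpenAI-compatible endpoint; the base URL, API key, and model identifier are placeholders, not confirmed values.

```python
# Minimal sketch, assuming an OpenAI-compatible chat-completions endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="https://example.com/v1",  # hypothetical endpoint
    api_key="YOUR_API_KEY",             # placeholder credential
)

response = client.chat.completions.create(
    model="llama-3.3-70b",  # hypothetical model identifier
    messages=[{"role": "user", "content": "Explain nucleus sampling in one sentence."}],
    temperature=0.7,  # higher = more creative, lower = more focused
    top_p=0.9,        # sample only from the smallest token set covering 90% probability mass
    max_tokens=256,   # response length: cap on generated tokens
)

print(response.choices[0].message.content)
```

Note that the context length is usually a property of the deployment rather than a per-request argument: the prompt plus the generated tokens must fit within the model's context window.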