What are parameters?
Parameters are the settings which define AI models

Parameters are the 'settings' that define a modern AI model.
In their raw form they are just numbers, because AI at its heart is just math.
That math is layered on top of more math, all running on extremely fast computer systems, and the end result is a neural network.
These neural networks, the basic 'brain' of an AI system, are then tweaked in various ways to produce different products and outcomes.
So, for example, one version becomes a GPT-style large language model, while another becomes a diffusion model for creating AI images, and so on.
As with everything to do with AI, this is a very simplified description, but it gives a ballpark sense of something immensely complex.
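To make 'parameters are just numbers' concrete, here is a minimal sketch of a single neural network layer in Python. The sizes and values are invented for illustration; real models chain together many thousands of such layers.

import numpy as np

# A toy neural network layer: its 'parameters' are nothing more
# than arrays of numbers (weights and biases) set during training.
rng = np.random.default_rng(seed=0)

weights = rng.normal(size=(3, 4))  # 12 weight parameters
biases = np.zeros(4)               # 4 bias parameters

def layer(inputs):
    # Multiply the inputs by the weights, add the biases, then apply
    # a simple non-linearity (ReLU). Stacking many such layers
    # produces a neural network.
    return np.maximum(0, inputs @ weights + biases)

x = np.array([1.0, 0.5, -0.2])  # a numeric input, e.g. encoded text
print(layer(x))                 # the layer's numeric output
print(weights.size + biases.size, "parameters in this one layer")

A model advertised as having '70 billion parameters' simply has vastly bigger versions of these same arrays.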
How do parameters work?
Parameters determine how AI models work. The intense math and algorithms that act on those parameters in effect deliver the 'soul of the machine'.
Generally speaking, the more parameters an AI model has, the more powerful it is.
That's because parameters are the variables that determine how the model reacts to inputs, and what outputs it produces.
Let’s take a specific example to make things easier to digest.
When you sit down in front of an AI chatbot and type the words 'what is the capital of Turkey?', those letters and words are converted into numbers and fed into the AI model's neural network.
That network can run either on your own computer or in the cloud.
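That conversion step is called tokenization. Here is a drastically simplified, hypothetical version in Python; real tokenizers use learned vocabularies of tens of thousands of sub-word pieces, but the principle of text in, numbers out is the same.

# A hypothetical, toy tokenizer. The vocabulary below is invented
# for illustration; real models learn theirs from huge text corpora.
vocab = {"what": 0, "is": 1, "the": 2, "capital": 3, "of": 4, "turkey": 5}

def tokenize(text):
    # Map each known word to its numeric ID; unknown words get -1.
    return [vocab.get(word, -1) for word in text.lower().split()]

print(tokenize("What is the capital of Turkey"))
# [0, 1, 2, 3, 4, 5] -- these numbers are what the model actually sees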
The model then runs an incredibly intricate set of calculations at blistering speed inside the computer's processing chips, using the parameters it was trained with to deliver the answer you're looking for.
These parameters include weights, which define the relative importance of each connection in the network, and biases, learned offsets that shift the results (not to be confused with cultural bias, which creeps in through the training data), plus a myriad of other components that shape and determine the answer it provides to you.
The word 'Ankara' looks like an easy answer, but it's the net result of perhaps billions of such calculations.
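Here is what one of those calculations looks like, reduced to a single artificial neuron in Python. The numbers are made up for illustration: each weight scales how much an input matters, and the bias shifts the result.

# One artificial neuron, the smallest unit these calculations run on.
# 'weights' scale how much each input matters; 'bias' shifts the result.
inputs = [0.9, 0.1, 0.4]
weights = [2.0, -1.0, 0.5]   # relative importance of each input
bias = 0.3                   # a learned offset

activation = sum(i * w for i, w in zip(inputs, weights)) + bias
print(round(activation, 2))  # 2.2 -- billions of these sums produce 'Ankara'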
Take one simple example: the model has to work out the meaning of the words, the context, and the most likely response to this specific question. Is 'Turkey' a country or a food, for instance?
All of this computation takes place in fractions of a second, and we see the results almost instantaneously.
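One last sketch: at the end of all that computation, the model has a score for each candidate answer, and turns those scores into probabilities, typically with a function called softmax. The scores below are invented; in a real model they emerge from the parameters.

import math

# Hypothetical scores the model might assign to candidate next words.
scores = {"Ankara": 9.1, "Istanbul": 5.3, "gravy": 0.2}

# Softmax: exponentiate each score, then normalise so they sum to 1.
total = sum(math.exp(s) for s in scores.values())
probs = {word: math.exp(s) / total for word, s in scores.items()}

for word, p in sorted(probs.items(), key=lambda kv: -kv[1]):
    print(f"{word}: {p:.3f}")
# Ankara dominates, because the context says 'capital', not 'food'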
The all-important parameters are set during training, and once set they stay fixed for that version of the model, until new training or fine-tuning is subsequently carried out.
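Training, in essence, means repeatedly nudging each parameter to reduce the model's error. Here is a hypothetical, toy-sized version of that process in Python, with one parameter instead of billions.

# A toy training loop: the parameter is nudged downhill on a made-up
# error curve, then frozen -- exactly as shipped model weights are.
weight = 0.5           # one parameter out of potentially billions
learning_rate = 0.1

def error(w):
    return (w - 2.0) ** 2   # toy loss: the 'right' value here is 2.0

for _ in range(50):
    gradient = 2 * (weight - 2.0)   # derivative of the toy loss
    weight -= learning_rate * gradient

print(round(weight, 4))  # ~2.0 -- this value now ships with the model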
The importance of parameters in the real world
Of course, parameters go way beyond AI.
The world is composed of a huge set of parameters: factors such as mass, velocity, size and length, and many more besides.
These are settings we use every day to define and manipulate our physical world. And just as in the physical world, parameters are not the only thing that defines how AI interacts with the world.
The quality of the training data, the architecture of the model and even the raw computing power applied to the neural network all have an effect on the outcomes.
The recent arrival of the DeepSeek model points to this stark fact.
This is a model that was trained using innovative techniques on relatively modest hardware, and yet managed to produce results equal to or better than those of huge established AI models from companies like OpenAI.
This has prompted a major rethink about the importance of model architecture in achieving real-world performance improvements. It's no longer a foregone conclusion that those with the biggest computers or the largest number of parameters will win.
We almost certainly haven't seen the full extent of the power of parameters in AI models and AI in general, but the arrival of techniques such as reasoning and 'thinking' may have subtly altered the weighting given to the initial training process.
And once AI starts to learn how to improve on its default parameters, there’s likely to be an exponential increase in the power and utility of these amazing components.

Nigel Powell is an author, columnist, and consultant with over 30 years of experience in the tech industry. He produced the weekly Don't Panic technology column in the Sunday Times newspaper for 16 years and is the author of the Sunday Times book of Computer Answers, published by Harper Collins. He has been a technology pundit on Sky Television's Global Village program and a regular contributor to BBC Radio Five's Men's Hour. He's an expert in all things software, security, privacy, mobile, AI, and tech innovation.