I was surprised to discover that this is a real license restriction, still in effect, and noted in NVIDIA's EULA for GeForce driver software on their site. The relevant part is section 2.1.3, Limitations:
No Datacenter Deployment. The SOFTWARE is not licensed for datacenter deployment, except that blockchain processing in a datacenter is permitted.
What this means is that if you tried to set up a large-scale service in a dedicated data centre, with multiple machines equipped with consumer-grade NVIDIA GeForce cards (including the RTX 3060), NVIDIA would not license your use of their driver software.
This restriction does not apply to training and research workstations, which you can use for any purpose, including AI. So you could kick-start a research team pursuing a neural network project using the relatively inexpensive gaming cards. Companies that specialise in building data science/AI workstations, like Lambda Labs, also use consumer GPU cards.
Cloud providers of GPU resources, like Google's GCP, abide by this restriction and only offer the more expensive Tesla cards, such as the V100 and A100.
If your goal is to construct your own AI data centre running a commercial system on cheaper consumer GPUs, then you are out of luck. Outside of that scenario, however, you could still use the same GPUs for research and development of that same service. Deploying the same models at scale would require the more expensive hardware, but most AI frameworks, such as TensorFlow or PyTorch, will happily run the same models on different GPUs. So it appears to only be a problem - and an additional expense, not necessarily a show-stopper - at the point you are scaling up to data centre services.
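The portability point is worth making concrete: PyTorch code is typically written to be device-agnostic, so the same model definition runs unchanged on a consumer GeForce card during development and on a Tesla-class card in deployment. Here is a minimal sketch (the tiny linear model is just a hypothetical stand-in; any `nn.Module` behaves the same way):

```python
import torch

# Pick the best available device: a consumer GeForce card on an R&D
# workstation, a datacenter card (e.g. A100) in deployment, or the CPU
# as a fallback. Nothing else in the code needs to change.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A trivial stand-in model, moved to whatever device was selected.
model = torch.nn.Linear(16, 4).to(device)

# A batch of dummy input data, created directly on the same device.
x = torch.randn(8, 16, device=device)

out = model(x)
print(out.shape)  # torch.Size([8, 4])
```

Because the device is chosen once at the top, swapping hardware tiers is a configuration change rather than a code change, which is what makes the "develop on consumer cards, deploy on datacenter cards" workflow practical.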