The final version of rules the regulator says will offer children in the UK "transformational new protections" online has been published.
Sites will have to change the algorithms that recommend content to young people and introduce beefed-up age checks by 25 July or face big fines.
Platforms which host pornography, or offer content which encourages self-harm, suicide or eating disorders, are among those which must take more robust action to prevent children accessing their content.
Ofcom boss Dame Melanie Dawes said it was a "gamechanger" but critics say the restrictions do not go far enough and were "a bitter pill to swallow".
Ian Russell, chairman of the Molly Rose Foundation, which was set up in memory of his daughter - who took her own life aged 14 - said he was "dismayed by the lack of ambition" in the codes.
But Dame Melanie told BBC Radio 4's Today programme that age checks were a first step as "unless you know where children are, you can't give them a different experience to adults.
"There is never anything on the internet or in real life that is fool proof… [but] this represents a gamechanger."
She said she was "under no illusions" that some companies "simply either don't get it or don't want to", but emphasised that the Codes had legal force.
"If they want to serve the British public and if they want the privilege in particular in offering their services to under 18s, then they are going to need to change the way those services operate."
Prof Victoria Baines, a former safety officer at Facebook, told the BBC it is "a step in the right direction".
Talking to the Today programme, she said: "Big tech companies are really getting to grips with it, so they are putting money behind it, and more importantly they're putting people behind it."
The new rules for platforms are subject to parliamentary approval under the Online Safety Act.
The regulator says they contain more than 40 practical measures tech firms must take, including:
- Algorithms being adjusted to filter out harmful content from children's feeds
- Robust age checks for people accessing age-restricted content
- Taking quick action when harmful content is identified
- Making terms of service easy for children to understand
- Giving children the option to decline invitations to group chats which may include harmful content
- Providing support to children who come across harmful content
- A "named person accountable for children's safety"
- Management of risk to children reviewed annually by a senior body
If companies fail to abide by the regulations, Ofcom said it has "the power to impose fines and - in very serious cases - apply for a court order to prevent the site or app from being available in the UK."
Children's charity the NSPCC broadly welcomed the Codes, calling them "a pivotal moment for children's safety online."
But they called for Ofcom to go further, especially when it came to private messaging apps which are often encrypted - meaning platforms cannot see what is being sent.