Meta bets on AI models for mobile devices – Computerworld

However, Meta researchers believe that effective SLMs with fewer than a billion parameters can be built, and that doing so would unlock generative AI for use cases on mobile devices, which have far less compute capacity than a server or a rack.

According to the paper, the researchers ran experiments with differently architected models of 125 million and 350 million parameters, and found that at these sizes, architectures prioritizing depth over width deliver better performance.
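To make the depth-versus-width trade-off concrete, the rough sketch below compares a shallow-and-wide layout with a deep-and-narrow one at a similar sub-billion parameter budget. The layer counts, hidden sizes, and the simplified per-layer estimate are illustrative assumptions, not the configurations reported in the paper.

```python
# Illustrative only: approximate transformer parameter counts for two
# hypothetical layouts at a similar budget. Each block is estimated as
# ~12 * d_model^2 parameters (attention + feed-forward), ignoring
# embeddings, biases, and layer norms.

def approx_params(num_layers: int, d_model: int) -> int:
    """Very rough transformer parameter estimate: ~12 * d_model^2 per layer."""
    return num_layers * 12 * d_model ** 2

# Two hypothetical ways to spend roughly the same ~150M-parameter budget:
shallow_wide = approx_params(num_layers=12, d_model=1024)  # ~151M
deep_narrow = approx_params(num_layers=30, d_model=640)    # ~147M

print(f"shallow/wide (12 layers x 1024 hidden): ~{shallow_wide / 1e6:.0f}M params")
print(f"deep/narrow  (30 layers x 640 hidden):  ~{deep_narrow / 1e6:.0f}M params")
```

Both layouts land near the same parameter count; the paper's finding is that, at this scale, spending the budget on more layers rather than wider ones tends to yield better model quality.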

“Contrary to prevailing belief emphasizing the pivotal role of data and parameter quantity in determining model quality, our investigation underscores the significance of model architecture for sub-billion scale LLMs,” the researchers wrote.
