TAIPEI, March 10 (Reuters) - Taiwan's Foxconn said on Monday it has launched its first large language model and plans to use the technology to improve manufacturing and supply chain management.

The model, named "FoxBrain," was trained using 120 of Nvidia's H100 GPUs and completed in about four weeks, the world's largest contract electronics manufacturer said in a statement.

The company, which assembles iPhones for Apple and also produces Nvidia's artificial intelligence servers, said the model is based on Meta's Llama 3.1 architecture.

It is Taiwan's first large language model with reasoning capabilities that is optimised for traditional Chinese and Taiwanese language styles, the company said.

Foxconn said that although there was a slight performance gap compared with the distillation model from China's DeepSeek, FoxBrain's overall performance is very close to world-class standards.

Initially designed for internal applications, FoxBrain covers data analysis, decision support, document collaboration, mathematics, reasoning and problem-solving, and code generation.

Foxconn said it plans to collaborate with technology partners to expand the model's applications, share its open-source information, and promote AI in manufacturing, supply chain management, and intelligent decision-making.

Nvidia provided support through its Taiwan-based supercomputer "Taipei-1" and offered technical consulting during the model's training, Foxconn said.

Taipei-1, the largest supercomputer in Taiwan, is owned and operated by Nvidia in Kaohsiung, a southern city on the island.

Foxconn will announce further details about the model during Nvidia's GTC developer conference in mid-March.