Anthropic Reportedly Plans to Raise $25 Billion at $350 Billion Valuation
Funding · Large Models · AI Infrastructure
According to TechCrunch, citing the Financial Times, Anthropic is advancing a funding round of approximately $25 billion at a valuation of around $350 billion, up sharply from its previous valuation of about $183 billion. GIC and Coatue are reportedly leading the round with $1.5 billion each, with Sequoia also participating. The report adds that Microsoft and NVIDIA have collectively committed up to $15 billion; the funds are to be used to expand computing infrastructure and accelerate enterprise sales.
xAI Says 1GW Colossus 2 Online for Training Grok 5
AI Compute · Data Center · xAI
Media reports indicate that xAI's gigawatt-scale supercomputing cluster Colossus 2 has come online, delivering approximately 1GW of power, with plans to scale to 1.5GW by April. The cluster is said to comprise around 550,000 GPUs for training Grok 5, which may reach up to 6 trillion parameters. xAI has deployed 168 Tesla Megapacks in Memphis as an energy-storage buffer, though the cluster's massive power draw and its impact on the surrounding community continue to spark controversy.
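As a rough sanity check on these figures, dividing the reported facility power by the reported GPU count gives the implied power budget per GPU. Both inputs are press-reported estimates, so this is back-of-envelope only:

```python
# Back-of-envelope check on the reported Colossus 2 figures.
# Assumes ~1 GW total facility power and ~550,000 GPUs (both from press
# reports); actual per-GPU draw is lower, since cooling, networking, and
# storage consume part of the facility budget.
total_power_w = 1e9      # ~1 GW, as reported
gpu_count = 550_000      # reported GPU count

watts_per_gpu = total_power_w / gpu_count
print(f"{watts_per_gpu:.0f} W per GPU (facility-level, incl. overhead)")
# → 1818 W per GPU (facility-level, incl. overhead)
```

Roughly 1.8kW per GPU at the facility level is plausible for current rack-scale accelerator deployments once cooling and networking overhead are included.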
China Telecom Open-Sources TeleChat3-105B with 4.7B Active Parameters
Open Source Model · MoE · Domestic Compute
China Telecom's TeleAI has open-sourced TeleChat3-105B-A4.7-Thinking, a fine-grained MoE model with 105 billion total parameters and approximately 4.7 billion active parameters per token, built from one shared expert and 192 routed experts, of which four are activated per token. The team reports more than 15 trillion tokens of base training data, with MoE communication and long-sequence training optimized on a domestically built 10,000-GPU cluster in Shanghai Lingang. The model supports code, math, and Agent tasks, and the weights and inference examples have already been released.
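The routing scheme described above (one always-on shared expert plus top-4 of 192 routed experts per token) can be sketched in a few lines. The scoring and weight-normalization details here are common MoE conventions assumed for illustration, not TeleAI's actual implementation:

```python
import math
import random

NUM_ROUTED = 192   # routed experts, per the release notes
TOP_K = 4          # routed experts activated per token

def route_token(router_logits):
    """Pick the top-k routed experts for one token and return
    (expert_index, weight) pairs. Weights are softmax-normalized over
    the selected experts only (a common MoE convention, assumed here
    rather than taken from TeleChat3's code)."""
    top = sorted(range(NUM_ROUTED), key=lambda i: router_logits[i],
                 reverse=True)[:TOP_K]
    exps = [math.exp(router_logits[i]) for i in top]
    z = sum(exps)
    return [(i, e / z) for i, e in zip(top, exps)]

random.seed(0)
logits = [random.gauss(0, 1) for _ in range(NUM_ROUTED)]
selected = route_token(logits)
print(len(selected))                                   # → 4
print(abs(sum(w for _, w in selected) - 1.0) < 1e-9)   # → True
# The shared expert would additionally process every token, which is
# what keeps active parameters near 4.7B despite 105B total.
```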
Fudan × National Meteorological Information Center Launch Financial Weather Model 'Shangji'
Industry Application · Weather AI · FinTech
Fudan University and the National Meteorological Information Center have jointly launched Shangji, an AI financial weather model designed to incorporate meteorological factors into asset pricing and risk management. The model aims to help listed companies in weather-sensitive industries manage climate risk, support banks and insurers in controlling risk on equity-pledge financing, promote emerging models such as climate investment and financing, provide investors with quantitative tools, and give academic researchers a data and methodological framework for testing asset-pricing theories.
Hugging Face Releases FineTranslations with 1 Trillion Tokens
Dataset · Multilingual · HuggingFace
Hugging Face has released FineTranslations, a parallel corpus of approximately 1 trillion tokens covering English aligned with over 500 languages. The dataset was built by translating non-English content from FineWeb2 with Gemma 3 27B, and the data-generation pipeline is publicly reproducible via Datatrove. InfoQ notes that it can improve multilingual machine translation, especially for low-resource language pairs, and can also serve as enriched monolingual pretraining data that preserves cultural context.
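A parallel corpus like this is typically consumed as (source, target) pairs for translation fine-tuning. A minimal sketch of that step, where the record layout and field names (`lang`, `text`, `translation`) are assumptions for illustration, not the dataset's documented schema:

```python
# Toy records standing in for FineTranslations rows; the real dataset
# would be streamed with `datasets.load_dataset(...)`, and the field
# names below are assumptions made for this sketch.
records = [
    {"lang": "sw", "text": "Habari ya asubuhi", "translation": "Good morning"},
    {"lang": "yo", "text": "Bawo ni", "translation": "How are you"},
]

def to_training_pair(rec):
    """Format one aligned row as a prompt/target pair for MT fine-tuning."""
    prompt = f"Translate from {rec['lang']} to en: {rec['text']}"
    return prompt, rec["translation"]

pairs = [to_training_pair(r) for r in records]
print(pairs[0][1])  # → Good morning
```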
Vercel Open-Sources agent-skills: Skill Packages for Coding Agents
Developer Tools · Open Source · AI Coding
Vercel has open-sourced agent-skills, a package-manager-style 'skill library' for AI coding agents that organizes reusable rules under the Agent Skills specification. Initial packages include react-best-practices (40+ performance rules), web-design-guidelines (100+ UI/UX rules), and vercel-deploy-claimable (one-click deployment that generates claimable preview links). Skills can be installed via npx skills i or npx add-skill and are automatically discoverable by tools such as Claude Code and Cursor, enabling natural-language-triggered code review and improvements.
Redis Creator Open-Sources flux2.c: Pure C Inference for FLUX.2-klein
Open Source · Inference Optimization · AIGC
antirez, the creator of Redis, has open-sourced flux2.c, a pure C implementation of FLUX.2-klein-4B inference that removes the dependency on Python/PyTorch/CUDA and supports both text-to-image and image-to-image generation. The project bundles the Qwen3-4B text encoder, loads safetensors weights (~16GB) directly, and supports Metal and BLAS acceleration; releasing the text encoder after initialization saves roughly 8GB of memory. The author notes that it is currently slower than PyTorch implementations but is far easier to integrate into embedded or native applications.
Developers Sentenced in China's First AI Companion Chat Obscenity Case
AI Governance · Content Safety · Legal
China's first criminal case over obscene AI companion chat has entered its second-instance (appeal) stage: the developer and the operator of an emotional-companionship app were sentenced to four years and to one year and six months in prison, respectively, after the app's AI interactions generated large volumes of obscene content. Reports cite around 116,000 registered users and 24,000 paying users, with total recharge revenue exceeding 3.63 million yuan; among a sample of 150 paying users, 141 had engaged in chats containing obscene content. The court found that the operators had actively enabled and promoted 'dirty talk' features through prompt engineering and marketing, playing a decisive role in the generation of illegal content, and therefore bore criminal liability.