Three Funding Rounds Totaling Hundreds of Millions in One Year: This Company 'Revived by AI' Wants to Make AR Glasses as 'Ordinary as Regular Glasses'

Friends, today I want to share something fascinating with you.
If you ask me what the most surreal storyline in the 2024 tech world was, my vote goes to "AR Saved by AI."
Yes, that's right: the AR glasses category that Meta and Apple poured tens of billions of dollars into, hyped for years as the future of the metaverse, only to be rejected by users as too heavy, too expensive, and too ugly, has suddenly been revitalized by AI large language models.
What makes this storyline even more interesting is that a company called "ZhiGe Technology" has completed three funding rounds totaling hundreds of millions over the past year, with heavyweight investors such as Yangtze Securities, Beijing New Materials Fund, and Dongkechuang Capital leading the rounds.
Even more notably, their CEO Meng Xiangfeng said something brutally honest: "Our industry was saved by AI."
Today, I'll break down what business logic and technological revolution lie behind this story of being "revived by AI."
From "Metaverse Fantasy" to "AI Tool": AR Finally Found Its Mission
Let me start with something counterintuitive.
What was the biggest pitfall of the AR industry over the past few years? I'll tell you: Ambition far exceeding capability.
The grand vision those giants painted went like this: put on AR glasses and the real world becomes your desktop. Place virtual objects anywhere; game, work, and socialize right before your eyes. The pitch was that AR would replace the smartphone as the next computing platform.
But what was the reality?
Glasses as heavy as a brick, batteries that couldn't last two hours, terrible display quality, and price tags in the tens of thousands.
Even worse, when you wore them down the street, passersby looked at you like an alien from the future. This social pressure of "being watched" drove consumer users away completely.
But after AI arrived, the entire narrative changed.
Today's AI glasses no longer emphasize "the metaverse of virtual-real fusion" but have instead become "AI assistants with displays."
You don't need it to project an entire virtual world before your eyes—you only need it to:
- Show a small line of text pointing the way when you glance up while navigating
- Quietly display meeting notes in the corner of the lens while you're in a meeting
- Pop up a window telling you which store sells the product you're staring at for less
AI was searching for its most suitable hardware carrier, and now it's found glasses.
Why? Because smartphones require you to pull them out, unlock them, and open apps: the entire process is too "heavy." Glasses sit right on your eyes, always on and ready at any moment. This is the hardware form AI should have.
So you see, from Alibaba's Quark AI glasses to Meta's Ray-Ban smart glasses, the core logic of these products has changed: not to replace the smartphone, but to become AI's "second screen."
Once this positioning changed, all the technical priorities followed: being lightweight, comfortable, and good-looking became more important than field of view or resolution.
When AR Glasses Need to "Look Like Regular Glasses," Waveguides Became the Biggest Bottleneck
At this point, I need to give you some technical knowledge.
The core technology of AR glasses is called "waveguide"—simply put, it's projecting light from a tiny projector through a thin lens into your eyes.
How difficult is this technology? Let me give you an analogy:
Imagine you need to carve thousands of nanoscale grating structures on a piece of glass thinner than paper, making light transmit, turn, and output precisely inside it—this is like carving "Along the River During the Qingming Festival" on a grain of rice.
And over the past few years, what was the biggest pain point of waveguide technology? Two words: optical defects.
- Low transmittance: Feels like wearing sunglasses, the world looks gray
- Strong grating effect: Obvious patterns on the lens, looking like cheap protective film
- Rainbow patterns: Colorful halos appear on the lens when light hits it
- Light leakage: Light meant to project into the eye leaks from the lens edges
These problems directly led to one result: AR glasses, when worn, obviously don't look like regular glasses.
What ZhiGe Technology has done is eliminate all four defects.
How Did ZhiGe Win the "War of Annihilation" Against Four Major Optical Defects?
I carefully studied ZhiGe's technical approach and found they didn't do "patching" but redesigned from the ground up.
First breakthrough: Transmittance over 98%
What does this mean? Ordinary glass has about 91-92% transmittance; ZhiGe's waveguide reaches 98%, and non-grating areas can even reach 99%.
In plain language: wearing these glasses feels almost no different from not wearing any.
Let me give you a comparison: early AR glasses had only about 40-60% transmittance, wearing them felt like viewing the world through dark sunglasses—who could tolerate that?
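To see what these percentages mean for a prescription wearer, here is a back-of-envelope sketch. It assumes the transmittances of stacked optical elements simply multiply (ignoring coatings and inter-surface reflections); the 92% figure for an ordinary lens and the 50% figure for early AR optics are illustrative values taken from the ranges above, not measured data:

```python
def stack_transmittance(*layers):
    """Total transmittance of stacked optical elements, assuming each
    layer's transmittance simply multiplies (ignores coatings and
    inter-surface reflections)."""
    total = 1.0
    for t in layers:
        total *= t
    return total

WAVEGUIDE = 0.98   # transmittance claimed for ZhiGe's waveguide
RX_LENS = 0.92     # ordinary glass lens, per the article's 91-92%
EARLY_AR = 0.50    # midpoint of the 40-60% cited for early AR optics

print(f"waveguide + prescription lens: {stack_transmittance(WAVEGUIDE, RX_LENS):.0%}")
print(f"early AR + prescription lens:  {stack_transmittance(EARLY_AR, RX_LENS):.0%}")
```

Under these assumptions, the modern stack passes roughly twice as much light as the early one, which is the difference between "feels like plain glasses" and "feels like sunglasses indoors."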
Second breakthrough: Grayscale gradient technology
The grating effect is an "inherent defect" of waveguide lenses because the surface has nanostructures that inevitably produce diffraction effects.
ZhiGe's approach: make grating density gradually sparser from center to edge, so the human eye during normal use (central vision area) hardly notices the grating's existence.
This technology sounds simple, but ZhiGe was the first to develop and mass-produce it—sometimes that's how first-mover advantage comes about.
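To make the grayscale-gradient idea concrete, here is a toy model of a center-to-edge density falloff. The cosine taper and the 20% edge floor are my own illustrative choices, not ZhiGe's actual (proprietary) profile; the point is only that density stays high where central vision looks and thins out toward the rim:

```python
import math

def grating_density(r, r_max, edge_fraction=0.2):
    """Toy center-to-edge falloff: full grating density at the lens
    center, tapering smoothly toward edge_fraction at the rim.
    The cosine taper and the 0.2 floor are illustrative choices only."""
    x = min(max(r / r_max, 0.0), 1.0)          # normalized radius, 0..1
    taper = 0.5 * (1 + math.cos(math.pi * x))  # 1 at center, 0 at rim
    return edge_fraction + (1 - edge_fraction) * taper

for r in (0.0, 0.5, 1.0):
    print(f"r = {r:.1f} -> relative density {grating_density(r, 1.0):.2f}")
```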
Third breakthrough: Patented rainbow-free platform
The cause of rainbow patterns is light dispersion (different wavelengths have different refractive indices)—this is a physical law that can't be eliminated, but can be "tricked."
ZhiGe built a complete patented technology platform that, through grating structure design and optical material optimization, makes rainbow patterns "disappear" in 95% of usage scenarios.
Let me ask you: if you don't notice its existence in 95% of your usage, does it really exist for you?
Fourth breakthrough: Waveguide architecture design suppresses light leakage
The essence of light leakage is light "escaping" during transmission—ZhiGe's unique waveguide optical architecture design keeps light exactly where it should be.
These four technological breakthroughs combined produce one result: AR glasses can finally "camouflage" as regular glasses.
3 Grams Weight, 0.5mm Thickness—What Do These Numbers Mean?
When I saw these numbers, I paused for a moment:
- Waveguide lens weight: 3 grams
- Waveguide lens thickness: 0.5mm
Friends, do you know how heavy and thick regular glasses lenses are?
A pair of 1.60 refractive index myopia lenses weighs about 5-8 grams, with thickness generally between 1-2mm depending on prescription.
ZhiGe's waveguide lenses are lighter and thinner than regular lenses.
What do these numbers mean in practice? That the "wearing burden" of AR glasses has been minimized: wearing them feels almost no different from wearing regular glasses.
And ZhiGe made another key innovation: a precision bonding solution for "concave" prescription lenses with flat waveguides.
Simply put: if you're nearsighted, you don't need to wear "clip-ons" anymore—the waveguide and prescription lens are made as one piece—this is the true "consumer-grade" experience.
Let me paint a scenario for you:
You leave home in the morning, putting on AR glasses that look exactly like regular glasses—same weight, thickness, comfort—but they can:
- Display arrows in the corner of the lens when navigating
- Pop up message previews when you receive WeChat messages
- Show real-time translation subtitles during meetings
- Quietly press the shutter when taking photos
This experience is what AI glasses should be.
Monthly Production of 250,000, Annual Production of 3 Million—What Signal Do These Numbers Send?
In the funding information, there's one piece of data I find particularly important:
- Monthly capacity: 250,000 units
- Annual capacity: 3 million units
- 2026 projected delivery: over 1 million units
Friends, do you know what this means?
This is the first company in the industry to announce a projected annual delivery of over 1 million waveguide units.
Let me translate this for you: the AR industry has officially entered the "mass production phase" from the "technology verification phase."
The AR industry of the past was more like a "lab game": the technology was cool, but capacity couldn't scale, costs couldn't come down, and products could only be sold to tech enthusiasts or business customers.
But when production capacity reaches the million-unit scale, the entire economic model changes:
- Marginal costs drop significantly: the higher the production, the lower the per-unit cost
- Supply chain bargaining power increases: million-level orders make upstream material suppliers rush to cooperate
- Yield optimization space opens up: the more you produce, the more mature the process, the higher the yield rate
This is like the new energy vehicle industry—when Tesla pulled Model 3 production to hundreds of thousands of units, the entire industry's cost curve was reshaped.
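The "higher production, lower unit cost" dynamic above is commonly modeled with a learning curve (Wright's law): each doubling of cumulative output cuts unit cost by a fixed percentage. A minimal sketch with an assumed 15% drop per doubling; all figures are illustrative, not ZhiGe's actual costs:

```python
import math

def unit_cost(cumulative_units, first_unit_cost, learning_rate=0.85):
    """Wright's law: each doubling of cumulative output multiplies unit
    cost by the learning rate (0.85 = a 15% drop per doubling).
    All figures here are illustrative, not ZhiGe's actual costs."""
    doublings = math.log2(cumulative_units)
    return first_unit_cost * learning_rate ** doublings

for n in (1_000, 100_000, 1_000_000):
    print(f"{n:>9,} cumulative units -> unit cost index {unit_cost(n, 100):5.1f}")
```

Even with a modest learning rate, going from thousands to millions of cumulative units drives the unit-cost index down several-fold, which is why crossing the million-unit threshold reshapes the economics.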
And in this process, ZhiGe made two key strategic moves:
1. Parallel nanoimprint and etching processes
- Nanoimprint: more cost-effective and suitable for mass production of small FOV (field of view) waveguides
- Etching process: reaches international leading levels in performance metrics like field of view, optical efficiency, and uniformity
This strategy is smart: nanoimprint for volume at the low end, etching for performance at the high end, advancing on both fronts at once.
2. First to mass-produce on 12-inch wafers
One 12-inch wafer can produce 15-20 waveguide units, doubling capacity compared to 8-inch wafers.
Let me do the math for you: at a monthly capacity target of 250,000 units, 8-inch wafers would require more equipment, more workers, and larger facilities, while 12-inch wafers squeeze more output from the same investment. This is the power of economies of scale.
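Here is that math spelled out, using the article's figures (250,000 units/month, 15-20 units per 12-inch wafer). The 8-inch yield is my own assumption, scaled by the (300 mm / 200 mm)² wafer-area ratio of 2.25 and ignoring edge effects:

```python
import math

MONTHLY_TARGET = 250_000  # the article's stated monthly capacity

def wafers_needed(units_per_wafer, target=MONTHLY_TARGET):
    """Wafers required per month at a given yield of good units per wafer."""
    return math.ceil(target / units_per_wafer)

# Article: 15-20 waveguides per 12-inch wafer.
per_12in = (15, 20)
# Assumption: an 8-inch wafer holds ~1/2.25 as many dies, from the
# (300 mm / 200 mm)^2 area ratio, ignoring edge effects.
per_8in = tuple(max(1, round(n / 2.25)) for n in per_12in)

print(f"12-inch: {wafers_needed(per_12in[1]):,} to {wafers_needed(per_12in[0]):,} wafers/month")
print(f" 8-inch: {wafers_needed(per_8in[1]):,} to {wafers_needed(per_8in[0]):,} wafers/month")
```

Under these assumptions, hitting the same monthly target on 8-inch wafers takes roughly twice as many wafer starts, which is exactly the equipment-and-floor-space burden the paragraph describes.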
Why ZhiGe? Why Now?
After discussing so many technical details, I want to analyze why it was precisely ZhiGe Technology that secured three funding rounds totaling hundreds of millions in 2024-2025.
First, deep technical accumulation
ZhiGe was founded in 2019 and has seven years of technical accumulation in the AR waveguide field.
Friends, you need to understand—waveguide technology isn't something you can catch up to quickly just by burning money—it requires long-term trial and error, iteration, and process refinement.
ZhiGe serves Alibaba's Quark AI glasses, multiple consumer electronics and internet giants, and AR glasses unicorn companies—orders from these customers are the best technical endorsement.
Second, timing the "AI+AR" window
Before 2023, the AR industry narrative was "metaverse," but that was a market requiring 5-10 years to mature.
After 2023, with the explosion of AI large models, AR suddenly found its "killer app"—as a hardware carrier for AI.
ZhiGe CEO Meng Xiangfeng put it this way: the narrative has shifted from "AR as virtual-real fusion and the metaverse" to "AI+AR glasses, that is, AI glasses with displays."
This judgment transformed ZhiGe from a "long-term bet" to an "explosive trend about to happen."
Third, capacity first, seizing the high ground of scale
While other manufacturers were still polishing technology in labs, ZhiGe had already pulled capacity to the million-unit level.
In manufacturing, "being able to mass-produce" is itself a moat—because from lab to factory lie countless pitfalls, and these pitfalls can only be filled with time and orders.
Will 2026 Be the "iPhone Moment" for AI Glasses?
At the end of this article, I want to make a prediction for you.
ZhiGe CEO Meng Xiangfeng says: "AI glasses sales will hit new highs in 2026."
Behind this prediction is key support: 2026 will be the turning point when AI glasses transform from "novelty products" to "daily tools."
Let me give you several signals:
- Top vendors entering the market: Meta, Alibaba, Xiaomi, ByteDance are all making AI glasses
- Supply chain maturing: capacity is climbing in every link, from waveguides to optical modules to batteries
- User awareness being established: the hot sales of Ray-Ban Meta glasses prove the market exists
And in this process, "shovel sellers" like ZhiGe will be among the biggest beneficiaries.
Why? Because no matter which vendor's AI glasses win, they all need waveguides.
This is like the smartphone era—no matter whether Apple, Samsung, or Xiaomi won, Corning's Gorilla Glass and TSMC's foundry services always had business.
I want to say one thing in conclusion:
Technological development is never linear—it requires waiting for the "right time."
AR glasses were misunderstood for so many years, not because they weren't good, but because they were born at the wrong time—without AI, they were just a flashy toy without direction; with AI, they become the key to opening the future.
And ZhiGe Technology's story precisely confirms this principle:
True technological revolutions often happen at the moment when "technical capability" and "application scenarios" perfectly align.
In 2026, when you walk down the street and see more and more people wearing "ordinary-looking" glasses, my prediction will have come true.
Friends, let's wait and see.