One Stock to Buy as AI Moves to the Edge

BY KEITH KAPLAN, CEO, TRADESMITH

The new John Deere X9 1100 combine harvester costs upward of $900,000. It can harvest 7,200 bushels of corn an hour without a human hand on the wheel.

And the machine isn’t just driving itself. Cameras on the front attachment scan each plant as it passes – thousands of images a minute. An onboard processor analyzes them, flags stressed plants, and adjusts the herbicide spray row by row… all before the combine has moved another foot down the field.

Many combines operate miles from the nearest cell tower. For a machine making thousands of decisions a minute, that could be a fatal flaw.

It isn’t in the John Deere X9. It doesn’t send data to a distant server to be processed the way ChatGPT does. The intelligence is programmed in before the combine leaves the barn – and it runs entirely on its own from there.

The same principle is at work in one of the world’s most cutting-edge auto plants. At BMW’s Dingolfing facility in Bavaria, Germany, AI systems scan hundreds of components a minute as they move down the line. They’re checking for hairline cracks and surface flaws no human eye could catch at that speed. There’s no time to wait for a distant server’s analysis. So BMW runs the analysis at the inspection point – making pass/fail calls before the next part arrives.

Intelligence has to be local in the operating room, too. In robot-assisted surgery, AI systems guide instruments in real time. They adjust for the surgeon’s hand tremor, flag tissue boundaries, and respond to movement faster than any human reflex could. A round trip to and from a remote server could add hundreds of milliseconds of delay… and put patients’ lives at risk.

These systems all run on something called Edge AI. It’s a kind of artificial intelligence that lives inside the machine itself, not in a distant data center.
It works without an internet signal, responds in milliseconds, and gets smarter the more it’s used.

And we need to pay attention as investors. Edge AI is shaping up to be one of the next phases of the AI boom – one most investors are missing because they’re still focused on the stocks that dominated the last one.

Nvidia (NVDA) owned the AI trade by supplying chips to giant data centers. Now, a race is underway to build a different kind of chip to power AI at the edge. And as we’ll look at today, the winner of that race doesn’t need the AI hype cycle to keep running. It just needs the world to keep automating.

Cloud vs. Edge AI: What’s the Difference?

When you type a question into ChatGPT, your words leave your device, travel to a far-off data center, get processed by Nvidia’s chips, and return to your screen. The round trip takes a second or two.

For a chat app, that’s fine. But now imagine that delay in the John Deere X9. Or on BMW’s production line. Or in an operating room.

Another important use case for Edge AI is self-driving cars. A Tesla navigating a highway intersection has to spot pedestrians, read traffic signals, judge the speed of oncoming vehicles, and decide whether to brake or accelerate – all simultaneously, in less than 100 milliseconds. That’s faster than the blink of an eye. If the car had to ping a data center for every one of those calls, it would crash before the answer came back.

Think of it this way. Cloud AI is a genius locked in a room miles away. You can call him, but there’s always a delay. Edge AI is a capable deputy riding along in the machine – always present, always ready, never dependent on a signal.

Why Edge AI Is About to Explode

Right now, most AI spending goes into cloud infrastructure – the data centers and Nvidia chips that power screen-based chatbots like ChatGPT. That’s the trade most investors have been riding. But Edge AI is catching up fast.
By 2030, the market for this new kind of AI is projected to grow from $11.8 billion today to $56.8 billion – nearly five times its current size. And that’s no surprise when you consider how many devices need to make real-time decisions without relying on a data center:

- Autonomous vehicles must detect obstacles, read road signs, and make split-second decisions – without a cell signal in a tunnel or a dead zone on the highway.
- Industrial robots in factories and warehouses run 24 hours a day on lines where a lag of milliseconds means a defective part or a safety incident.
- Smart security cameras in airports, hospitals, and city infrastructure analyze footage in real time without sending every frame to a remote server.
- Smartphones and laptops already process voice commands, face recognition, and photo editing using AI without sending your data anywhere.
- Power grids have AI systems that monitor for faults and make automatic corrections faster than any human operator could respond.
- Satellites in orbit must process sensor data and make operational decisions on their own without sending data all the way back to Earth.
The world is filling up with machines that need to think for themselves. Every one of them needs a chip that can process information on the spot – without phoning home to a data center.

Investors watching Nvidia’s data center revenue and calling it the AI trade are missing the bigger picture. The longer-duration opportunity is in the billions of devices that will need Edge AI chips over the next decade – in cars, factories, farms, hospitals, homes, battlefields, and space.

How to Play It

One of the companies at the center of this shift isn’t a household name the way Nvidia is. But if you’ve used a smartphone in the last decade, you’ve used its technology.

ARM Holdings (ARM) designs the chip architectures that power most of the world’s mobile devices. ARM doesn’t manufacture chips. It licenses its designs to Apple, Qualcomm, Samsung, and hundreds of others – who build their own processors on top of its blueprints.

That model has made ARM’s architecture the dominant standard in low-power computing. And low power is exactly what Edge AI demands. Data center chips burn through enormous amounts of electricity. That’s fine when a chip sits in an air-conditioned facility connected to the grid. It’s a problem when it sits inside a combine harvester, a surgical robot, or a car.

In January 2026 at CES – the world’s biggest consumer technology trade show – Nvidia CEO Jensen Huang said that the “ChatGPT moment for physical AI is here.” He made those remarks specifically in the context of ARM-powered devices… and for a reason.

Virtually every major chipmaker building for the edge – including Nvidia – licenses ARM architecture. ARM doesn’t compete with other chipmakers. It supplies the foundation they all build on.

That’s a fantastic business to be in. As the number of AI-enabled devices scales from billions to tens of billions over the next decade, the royalty stream flowing back to ARM scales with it.
Every autonomous vehicle, every smart camera, every next-generation smartphone that ships with an AI processor is likely running on an ARM design.

ARM, like other stocks in the semiconductor industry, is in a Short-Term Health Yellow Zone. That’s a caution sign that its bullish uptrend could be breaking down. So this isn’t a “buy today and check back in a week” situation.

But even if the trend is unclear right now, ARM is an elite company. That’s going by our Quantum Score. It rates stocks on a 0–100 scale across two factors: business fundamentals and technical momentum. On the fundamental side, it looks at things like revenue growth, earnings growth, profit margins, and free cash flow – both recent and sustained over three years.

ARM’s Fundamental Score is 90. That puts it among the top fraction of the more than 10,000 stocks our system tracks. A score of 90 says ARM is growing, profitable, and generating real cash – the kind of company that tends to reward patient investors who buy at the right moment.

So if you believe the physical world is about to get a lot smarter – and the evidence says it is – ARM is one of the clearest ways to own that trend as a long-term hold.

All the best,
Keith Kaplan
CEO, TradeSmith

P.S. I recently posted some thoughts on X about other ways to play the AI boom without touching crowded trades like Nvidia.

Yesterday, I wrote about how the Utilities Select Sector SPDR Fund (XLU) has held steady as a rock while stocks around the world dropped. The story here is simple. AI data centers are consuming enormous amounts of electricity. That electricity has to come from somewhere, and utilities are the ones supplying it. This is a long-term, structural demand driver, not a short-term trade.

I also wrote about another AI energy play that pays you an annual yield of 7.5%. That’s nearly twice what you’ll earn on a 10-year Treasury note, with AI upside in the mix, too.

To catch my latest ideas, make sure to follow me on X @KeithTradeSmith.