By Ann Bednarz, Executive Editor

4 essential edge-computing use cases

Feature
Sep 14, 2020 | 18 mins
Enterprise Storage | Internet of Things | Networking

Placing processing power and storage at the edge of enterprise networks takes many forms but delivers faster response times and can reduce the need for WAN bandwidth.


Edge computing means different things to different players. But one thing is constant: Location matters.

Edge computing enables autonomous mining equipment to react to unexpected conditions a mile below the surface, even when disconnected from a network. When a hotel guest places a food order from a mobile phone and wants to have it delivered poolside, edge computing makes it possible to steer servers to the guest’s lounge chair.

Sensors, smart devices, and mobile users are proliferating across all industries. Enterprises are investing in edge deployments to keep up with the growing volume of decentralized data that needs to be processed in place. When low latency is essential, edge setups take the delay out of moving data to a data center or public cloud for processing.

Another factor is the cost of backhauling data to a central location. Processing and analyzing locally can translate to less need for expensive bandwidth.

Edge computing also supports more immersive interactions by putting real-time data production closer to where people and machines operate. By 2022, more than 50% of enterprise-generated data will be created and processed outside the data center or cloud, according to Gartner.

At the same time, edge deployments are not islands. Edge infrastructure is typically managed centrally, via an enterprise data center or the cloud. Edge services might be delivered by content-delivery networks, colocation providers, and cloud service providers that are extending their footprints with regional data centers, local micro data centers, and compute capacity tacked onto telecom towers. Enterprise IT is tasked with integrating all the pieces – sensors, mobile devices, access points, gateways, edge servers, storage and networking gear – and managing operations that extend from the edge to on-prem and cloud data centers.

With edge computing, enterprises tend to start with a single, narrow use case, led by a vendor or system integrator, according to Gartner. But that’s just the beginning. Over the next few years, enterprises are expected to expand to a wide range of edge-computing use cases. By the end of 2023, more than 50% of large enterprises will deploy at least six edge-computing use cases for IoT or immersive experiences, compared to less than 1% in 2019, Gartner predicts.

Four enterprises in different industries shared their edge-computing endeavors and how they’re realizing gains in employee safety, productivity, customer service and revenue. Here’s what we learned.

AI at the edge speeds freight-train maintenance

Idle time is the enemy of freight-train operations.

Railroad companies measure freight-train performance in terms of average velocity, which generally hovers in the 35-to-37-mph range, according to Scott Carns, COO at Duos Technologies. “If they can gain one mile-per-hour in velocity, that equates to about $10 million in profitability. So there’s a huge push to invest in technologies that make them more efficient.”

One thing that slows freight trains is inspections. As freight trains cross the rail lines in North America, operators have to make time for mandatory car inspections. Historically, that would require people to conduct visual and physical examinations of mechanical components on the cars while they sit idle in a railyard. Duos is working to streamline that process using sensors, imaging and analytics deployed in the field.

The Jacksonville, Fla.-based company is building rail-inspection portals that stand like giant industrial carports. Instead of getting a physical examination in an inspection yard, a train passes through a Duos portal and undergoes an automated inspection without having to slow its speed. The portals are outfitted with high-powered LED lights to illuminate the trains and with cameras that can capture images at high speeds.

“We build these big portals, and trains go through at 40, 50, 60 miles per hour. We image the entire train from all sides, and then we run AI algorithms to look for what we call use cases, or mechanical defects, on the rail car,” Carns says. The system can detect oil leaks, damaged parts, and missing hatches, for example, and pinpoint the issues that need fixing.

“We capture every rail car from nine different camera perspectives,” Carns says. Each camera generates about 1GB of data, so a single car generates 9GB. An average train is about 120 cars, which puts the data tally at about 1TB per train. The high volume of data, combined with the inconsistency of available WAN bandwidth at remote track-side sites around the country, makes it impractical to send the data for processing in a central data center.

Adjacent to the portal is a small hut for IT systems. Two 45U racks contain ruggedized Dell XR2 servers and PowerVault storage. “We essentially build a small data center on the track side to process all that data at the edge,” Carns says.

AI processing starts immediately. To comply with rail safety regulations, the inspections have to be conducted quickly so that any needed repairs can be made when the trains arrive at the nearby railyard. “We have to do it in near real-time, because the train has to be inspected before it gets to its destination.”

In addition, depending on the location of the portal, there might be 30 to 40 trains that pass through each day. “You have to be done processing before the next train comes through,” Carns says. “That’s a huge, high-data operation that takes place track-side.”
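Put the figures Carns cites together and the scale of the track-side job becomes clear. The following is a rough, back-of-the-envelope sketch of the per-train data volume and the processing window, not Duos’ actual pipeline:

```python
# Rough back-of-envelope estimate based on the figures Carns cites.
# This is an illustrative sketch, not Duos Technologies' actual pipeline.

CAMERAS_PER_CAR = 9     # nine camera perspectives per rail car
GB_PER_CAMERA = 1       # roughly 1 GB of imagery per camera
CARS_PER_TRAIN = 120    # average train length
TRAINS_PER_DAY = 40     # busy portals see 30 to 40 trains per day

gb_per_car = CAMERAS_PER_CAR * GB_PER_CAMERA   # ~9 GB per car
gb_per_train = gb_per_car * CARS_PER_TRAIN     # ~1,080 GB, roughly 1 TB
gb_per_day = gb_per_train * TRAINS_PER_DAY     # ~43 TB per day at a busy site

# Worst case, the next train arrives in 24 hours / 40 trains = ~36 minutes,
# so the edge cluster has to finish analyzing one train's ~1 TB of imagery
# well inside that window.
minutes_between_trains = 24 * 60 / TRAINS_PER_DAY

print(f"~{gb_per_train:,} GB per train, ~{gb_per_day / 1000:.0f} TB per day")
print(f"~{minutes_between_trains:.0f} minutes before the next train arrives")
```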

Duos migrated from Dell’s PowerEdge VRTX blade servers to the XR2s about 18 months ago. The XR2 platform is enabling Duos to make its systems more modular, Carns says. The ruggedized aspect is also critical, given the environmental conditions the servers have to withstand. For example, one of the Duos portals is deployed in Winnipeg, Canada, where temperatures can climb to 90 degrees Fahrenheit in the summer and plunge to minus 40 in the winter.

“One of our biggest challenges is maintenance and serviceability,” Carns says. “Outside of the military and marine applications, the railroad is probably about as far to the edge as you can get.”

Looking ahead, Duos plans to continue to expand its buildout of portals. Over the next few years, the company expects to have 60 or 70 systems in production, Carns says.

On the development side, the company continues to expand the range of its AI applications. Most recently, Duos began prototyping an application that uses thermal imaging to capture and analyze how electrical traction motors, which are mounted under the train cars, operate at different temperatures. The goal is to improve predictive maintenance for these costly parts that tend to break and burn out, Carns says. “Nobody had this data point before.”

Location-aware apps add to hotel room smarts

Sometimes, doing more at the edge can mean subtracting hardware. At Nobu Hotels, an upgraded wireless infrastructure is enabling more IoT and AI-driven applications with less proprietary hardware.

Nobu is standardizing on network infrastructure for its hotels that will allow it to converge Wi-Fi access with a range of applications for guests and employees, including content delivery, lighting controls, door locks, and safety alarms. At the same time, it’s enabling Nobu to streamline its onsite hardware requirements, which saves time and money, says Rodney Linville, global corporate director of IT at Nobu Hospitality.

“To simplify opening a hotel and to be able to integrate the technologies at the property – I needed to make sure that the technology chosen was able to do that,” Linville says. Not sometime in the future, but right out of the gate. “Aruba had their foot forward.”

Nobu deployed Aruba gear beginning with three new hotels in Chicago, Warsaw and London. It’s using Aruba edge and core switches, ClearPass Policy Manager for role- and device-based secure network access control, and location-ready access points with support for IoT devices running Wi-Fi, Zigbee, Bluetooth, and third-party protocols.

The network infrastructure supports administrative applications, such as the hotel’s mobile point-of-sale (POS) and property-management systems, as well as guest applications. It delivers all the technology within the rooms – door locks, climate control, hospitality services, and more. Using their own mobile devices, guests can check in or out of the property, order room service, request linens, or raise the blinds. Employees can monitor room access, reprogram locks, or call for help if there’s an emergency.

“Being able to have that connectivity anywhere in the room and being able to control those services in the room – that’s where our rooms are going,” Linville says. The goal is to allow guests to use their own phones or wireless devices to manage not only hotel-wide services such as hospitality but also in-room, location-dependent systems such as the door lock, TV or air conditioner.

To make that happen, the IT team is pairing the Bluetooth beacon and Zigbee radio technology that’s built into Aruba’s access points with IoT and analytics applications that generate, analyze and act upon data in real time.

Nobu initiated the integration of Aruba’s infrastructure with access-control technology from ASSA ABLOY Global Solutions, for example. The ASSA ABLOY technology lets guests use their smartphones for contactless check-in and to unlock their rooms. In the past, Nobu had to deploy dedicated in-room gateways for the ASSA ABLOY contactless entry systems. With the integration, Nobu can use the Aruba access points, rather than separate Zigbee gateways, to secure communications between the in-room smart door locks and the centralized lock-management software.

Linville pushed for ASSA ABLOY to use the Aruba hardware instead of the proprietary gateways. “They would have, almost, another network inside the hotel just to control the locks,” Linville says. “I don’t want to pay for that when I have the technology already at the hotel. Why can’t you guys just talk?”

Another key project was getting the Aruba gear to work with a third-party employee-safety application. React Mobile makes portable panic buttons that are designed to alert security personnel if a hotel employee encounters a dangerous or threatening situation such as an abusive guest or a medical emergency. The alarm system incorporates location-based services, and in the past, it required IT to deploy and maintain a separate overlay network with proprietary Bluetooth beacons installed in the rooms.

By integrating the systems, Nobu can use Aruba’s IoT-enabled access points to provide the necessary connectivity: If a worker presses the “help” button on the React Mobile smartphone app, Bluetooth beacons in the Aruba access points enable the app to determine the user’s location, which is sent via Wi-Fi to summon help.
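The location step in that flow is essentially nearest-beacon positioning. The sketch below is a hypothetical illustration of the idea, choosing the access point whose beacon is heard most strongly and attaching that location to the alert; it is not the React Mobile or Aruba implementation, and the names in it are invented.

```python
# Hypothetical sketch of beacon-based alerting: pick the access point whose
# Bluetooth beacon is heard most strongly, then send that location with the
# alert over Wi-Fi. Not the actual React Mobile / Aruba implementation.

from dataclasses import dataclass

@dataclass
class BeaconReading:
    ap_name: str      # access point / room the beacon is installed in
    rssi_dbm: int     # received signal strength (closer to 0 = stronger)

def locate(readings: list[BeaconReading]) -> str:
    """Return the name of the AP with the strongest beacon signal."""
    strongest = max(readings, key=lambda r: r.rssi_dbm)
    return strongest.ap_name

def build_panic_alert(employee_id: str, readings: list[BeaconReading]) -> dict:
    """Build the alert payload that would be sent to security over Wi-Fi."""
    return {
        "employee": employee_id,
        "location": locate(readings),
        "type": "panic_button",
    }

# Example: the app hears beacons from three nearby access points.
readings = [
    BeaconReading("room-1204", -48),
    BeaconReading("hallway-12F", -63),
    BeaconReading("room-1206", -71),
]
print(build_panic_alert("emp-0042", readings))
# {'employee': 'emp-0042', 'location': 'room-1204', 'type': 'panic_button'}
```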

Again at Nobu’s prompting, the vendors worked together to integrate their systems in time for the opening of the hotel’s new Chicago facility. It required some trial and error to deploy it, Linville says, but the effort paid off the next time. “In Warsaw it was flawless.”

Aruba’s willingness to invest in R&D and explore AI possibilities was a big draw, Linville says. “One of the reasons we chose Aruba is because they are as eager to do these types of integration as I am. They see the value in it.”

Another feature that drew Nobu to the Aruba platform is security. The Aruba gear has integrated role-based firewall and intrusion prevention capabilities, which help Nobu stay compliant with Payment Card Industry (PCI) data-security requirements for its wireless point-of-sale application, for example. Other enterprise vendors can offer that kind of security, but it would require another appliance in the network, Linville says. “If I can utilize that in their product, I’d rather use that than add another piece of hardware.”

HCI at the edge streamlines grocery store IT

A trade-show demonstration of a hyperconverged infrastructure (HCI) platform caught Jeff Miller’s eye several years ago, but the size of the system was a dealbreaker. “I thought, ‘that’s perfect, except I need it about 10 times smaller,'” Miller recalls. He stayed in contact with the vendor, Scale Computing, and kept tabs on his options until a form factor that worked for his needs became available.

Miller is director of IT at Jerry’s Foods, a retailer with 50 grocery, liquor and hardware stores in Minnesota and Florida. In the past, each location had virtual servers and a SAN. The legacy systems were expensive and difficult to manage. Now, Jerry’s Foods is nearing the end of a project to deploy Scale Computing’s HE150 edge devices in its locations – a move that’s enabling the company to rearchitect its whole in-store IT strategy, Miller says.

The HE150 is an all-flash, NVMe storage-based compute appliance that’s built off the Intel NUC mini PC. It measures roughly the size of three smartphones stacked on top of each other. It includes disaster recovery, high-availability clustering, rolling upgrades and integrated data protection, “all local and in a very small footprint,” Miller says.

Local processing is critical for a few core store applications. Product information needs to be managed onsite, for example. The stores carry between 65,000 and 70,000 items, which are tracked in a database that’s constantly being updated as suppliers make changes to their products. POS transactions depend on access to that data, which includes UPC codes, vendor numbers, product categories, and prices. “All of that has to reside local and be able to be looked at locally,” Miller says.
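What “reside local” looks like in practice can be sketched simply: a checkout lane looks up a scanned UPC against an on-site database, with no WAN round trip. The example below is a hypothetical illustration, not Jerry’s Foods’ actual system; the table and column names are invented.

```python
# Hypothetical sketch of a local product lookup a POS lane depends on:
# an on-site database keyed by UPC, so a scan never needs the WAN.
# Table and column names are invented for illustration.

import sqlite3

conn = sqlite3.connect("store_products.db")  # lives on the in-store appliance
conn.execute("""
    CREATE TABLE IF NOT EXISTS products (
        upc TEXT PRIMARY KEY,
        description TEXT,
        vendor_id TEXT,
        category TEXT,
        price_cents INTEGER
    )
""")
conn.execute(
    "INSERT OR REPLACE INTO products VALUES (?, ?, ?, ?, ?)",
    ("012345678905", "Whole milk, 1 gal", "V-1002", "dairy", 389),
)
conn.commit()

def price_for(upc: str):
    """Look up a scanned item's price locally, without touching the WAN."""
    row = conn.execute(
        "SELECT price_cents FROM products WHERE upc = ?", (upc,)
    ).fetchone()
    return row[0] if row else None

print(price_for("012345678905"))  # 389
```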

Another critical on-site application is the electronic benefit transfer (EBT) system, which requires a link between the local POS app and state-run systems to verify the availability of funds. The EBT application runs on the HE150 infrastructure, which is clustered for high availability, and pings the state-run system to confirm the user’s benefits balance. If shoppers can’t use EBT funds, they’ll shop somewhere else, which can have a big impact on sales at busy Jerry’s Foods locations. “If EBT goes down at the Lake Street store in Minneapolis, for example, in less than an hour we could be out $40,000,” Miller says.
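Unlike the product lookup, the EBT check does need a network round trip, which is why availability matters so much. A minimal, hypothetical sketch of that kind of balance check, with a timeout so a slow or unreachable state system doesn’t stall the lane, might look like this; the endpoint and field names are invented, and this is not the retailer’s actual integration.

```python
# Hypothetical sketch of a local POS confirming an EBT balance against a
# state-run system before approving payment. Endpoint and field names are
# invented; this is not the retailer's actual integration.

import requests  # third-party HTTP client, used here for brevity

STATE_EBT_URL = "https://ebt.example-state.gov/balance"  # placeholder URL
TIMEOUT_SECONDS = 3  # a checkout lane can't wait long for an answer

def ebt_covers_purchase(card_token: str, amount: float) -> bool:
    """Return True if the state system confirms sufficient EBT funds."""
    try:
        resp = requests.get(
            STATE_EBT_URL,
            params={"card": card_token},
            timeout=TIMEOUT_SECONDS,
        )
        resp.raise_for_status()
        return resp.json()["available_balance"] >= amount
    except requests.RequestException:
        # If the state system is slow or unreachable, fail closed and let
        # the cashier fall back to another tender type.
        return False
```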

For the most part, the in-store workloads today aren’t compute heavy or intelligence heavy, but that’s due to change. For example, one of the retailer’s stores was severely damaged during the riots in Minneapolis this summer. It’s being rebuilt to include enhanced building-security systems that will put more of a load on local IT resources. At other locations, Jerry’s Foods is considering installing solar rooftops as part of an effort to reduce energy costs, and, “If we do that, there will be a ton of intelligence around HVAC,” Miller says.

With the edge appliances, Jerry’s Foods can customize which applications run at each location. At sites with greater edge-processing requirements, Miller plans to deploy Scale Computing’s larger HCI appliances. Right now, the choice of unit depends on how much video needs to be stored locally. “It’s still edge to us, whether it’s big edge or small edge,” Miller says.

In the big picture, the Scale Computing appliances have slashed the time IT spends managing in-store infrastructure because there is less of it, and the rollout is expected to reduce the cost per instance by 50% over five years compared to the legacy virtualization system. “The nice thing about Scale, all of that [legacy infrastructure] is gone. We are 100% out of the VMware business,” Miller says.

Gone, too, are a number of proprietary hardware devices. In the past, stores had to maintain dedicated hardware for specific applications, such as fuel rewards and the extra coupons that print with a receipt. “We had one-offs for all kinds of stuff,” Miller says. “All of those different things were point solutions with different servers. We’re pulling all of that back so we’ll be able to slim down the actual hardware needs at the register and at the edge.”

Jerry’s Foods still has a few stores left to convert to the new edge-infrastructure model, which will be done by the spring of next year, Miller says. Looking ahead, he’s considering tying cloud storage to the stores’ edge-run applications. “We could use a small-scale device for those applications that have to be local – POS and antivirus and PCI defense – but we could push all of that storage need off to the cloud,” Miller says. “I see that as the future.”

Mining environment highlights IT/OT convergence

Engineers at Boliden are mining for copper, zinc, lead, gold and silver. But they’re doing it from the comfort of an above-ground control room rather than from tunnels below the surface.

At the Boliden site in Garpenberg, Sweden, autonomous mining equipment operates in areas that could be unsafe to send people. Laden with dust and prone to water infiltration, toxic gases and vibration, the environment is inhospitable to communications gear, too.

Automation has been transforming the mining industry for many years. Sandvik Mining and Rock Technology manufactured its first autonomous mining equipment as early as 2004. Today, the state-of-the-art is a fully autonomous fleet of equipment running in an underground mine. “The [onboard] computers are controlling the machines independently, without any human interaction needed,” says Petri Mannonen, product line manager of mining information management at Sandvik, which provides gear and services for jobs including rock drilling, crushing and screening, loading and hauling, tunneling, and demolition.

The main driver of mining modernization is employee safety. “Underground mining is a hazardous environment. Getting people away from that hazardous environment increases the safety,” Mannonen says.

Improving operational efficiency is another key target.

As mining technology has advanced, so too have the requirements for network connectivity – it’s not just a laptop that’s running underground, it’s network-connected machinery that’s critical to operations. As mines become smarter, mining companies are extending connectivity from small dedicated networks to mine-wide networks. With more pervasive connectivity, mining companies can use proximity detection to keep people, equipment and other resources safely distanced, for example. In an emergency, more reliable connectivity can help coordinate communications and response.
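Proximity detection of that kind boils down to repeatedly checking distances between tracked positions. The sketch below is a simplified, hypothetical illustration rather than Sandvik’s or Cisco’s software: it flags any person who comes within an assumed exclusion radius of a piece of equipment.

```python
# Simplified, hypothetical proximity check: flag people who come within an
# exclusion radius of autonomous equipment. Not Sandvik's or Cisco's software.

import math

EXCLUSION_RADIUS_M = 30.0  # assumed safety distance, in meters

def too_close(person_xy: tuple, machine_xy: tuple) -> bool:
    """True if a person is inside the machine's exclusion zone."""
    return math.dist(person_xy, machine_xy) < EXCLUSION_RADIUS_M

# Positions (in meters, on the mine's local coordinate grid) as they might
# arrive from the tracking system.
people = {"miner-17": (120.0, 45.0), "miner-22": (410.0, 88.0)}
machines = {"loader-03": (135.0, 52.0)}

for person, p_xy in people.items():
    for machine, m_xy in machines.items():
        if too_close(p_xy, m_xy):
            print(f"ALERT: {person} is within {EXCLUSION_RADIUS_M} m of {machine}")
```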

At Boliden’s Garpenberg site, Sandvik supplies the mining equipment along with operational software, which depends on Cisco’s industrial IoT networking technology – both wired and wireless – for connectivity. Sandvik’s AutoMine software enables autonomous control of underground mining equipment as well as teleremote control so mining engineers in the control room can track resources and make adjustments as necessary. Boliden is also using Sandvik’s OptiMine software, which collects data from the mining equipment and analyzes it to optimize production.

On the network side, Boliden operates a low-latency IoT network using Cisco industrial switches and access points. Keeping the network connected underground is imperative. “There really needs to be a robust and reliable communications network from the equipment both to the control room and to the safety solution to be able to ensure that the environment and machines are operating safely,” Mannonen says.

That connectivity has helped transform mining operations in general, and Sandvik’s business model in particular, says Dave Wilson, managing director of IoT global sales at Cisco.

“Now that we’re able to provide networking in these harsh environments that these machines can connect to, [Sandvik] has been able to reimagine and evolve their business model,” Wilson says. “They can make their vehicles smarter. They can have different pieces – tools and sensors and software – that enable them to turn their vehicle into one big sensor that analyzes what’s going on. Then you can optimize the whole process.”

The setup at the Garpenberg mine, a mile underground, allows mining engineers to track people and equipment assets in real time. Onboard sensors are used for navigational purposes as well as to track the equipment’s health and make sure it’s operating at optimal levels. Data collected from the mines also informs operational decisions; it can be used to streamline scheduling, for example.

It’s transformative for mining operations. “That truck that’s autonomous – it can go on, the ‘driver’ isn’t going to be fatigued,” Wilson says. “They can get accuracy down to the centimeters in these mines. They can go to places they wouldn’t have been able to mine before.”

Some calculations are done independently at the farthest edges of the mines. Equipment might be running in an area that doesn’t have continuous network coverage, so the analysis can be done onboard.

More advanced analytics are done above ground. “Typically, the IoT-type of data collection from the equipment itself is either sent to the local central repository where the data is stored” or it can be sent for analysis in the cloud, Mannonen says.

As edge environments go, mining highlights the trend toward a convergence of IT systems and operational technology. Instead of digitizing the corporate branch and campus edge, “what we’re doing now with these technologies is allowing the edge of the world and the businesses to digitize, which is really the heart of the business,” says Wilson.