A Universe Full of Opportunities for AI and Supercomputing

The main reason, of course, is that we’re now seven months into the year-long mission of HPE’s Spaceborne supercomputer on the International Space Station (ISS). The first high-performance commercial off-the-shelf (COTS) computer system ever sent into space, Spaceborne is there to find out whether an onboard supercomputer could one day support astronauts on a lengthy journey to Mars. So far, so good. The system, with three concentric rings of intelligence built into it, is designed to self-govern: it will shut itself off if it senses it might break (something we all could learn!), with the intent of remaining operational for the astronauts throughout the journey.
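As a rough illustration of the kind of self-governing behavior described above, the sketch below shows a health-monitoring loop that stops work when onboard telemetry suggests imminent failure. The sensor names, thresholds and shutdown hook are hypothetical, for intuition only, and are not HPE's actual design.

```python
# Hypothetical sketch of self-governing hardware protection: poll telemetry and
# shut down gracefully when any reading crosses a safety threshold.
import time

THRESHOLDS = {"cpu_temp_c": 85.0, "radiation_events_per_min": 50, "psu_voltage_v": (11.4, 12.6)}

def read_sensors():
    # Placeholder: a real system would query hardware telemetry here.
    return {"cpu_temp_c": 62.0, "radiation_events_per_min": 3, "psu_voltage_v": 12.1}

def healthy(readings):
    if readings["cpu_temp_c"] > THRESHOLDS["cpu_temp_c"]:
        return False
    if readings["radiation_events_per_min"] > THRESHOLDS["radiation_events_per_min"]:
        return False
    low, high = THRESHOLDS["psu_voltage_v"]
    return low <= readings["psu_voltage_v"] <= high

def monitor(poll_seconds=5):
    while True:
        if not healthy(read_sensors()):
            print("Fault condition detected - shutting down to protect hardware")
            break  # a real system would trigger a controlled shutdown here
        time.sleep(poll_seconds)
```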

Spotlight

Railinc Corp.

Freight railroads are the backbone of industry and commerce in North America. Everyday items from food and clothes to industrial materials and chemicals travel by rail across our great land. Our people make the freight railroad industry safer, more productive and more efficient through their data and IT expertise and Railinc's information services. We keep your career and the North American freight rail system moving. Want to get inside our company? Read all about us on the Railinc Tracks blog here: https://www.railinc.com/rportal/railinc-tracks-blog

OTHER ARTICLES
Software, Future Tech, Application Development Platform

Are Telcos Ready for a Quantum Leap?

Article | July 24, 2023

Quantum technologies present telcos with both an opportunity, to solve difficult problems and provide new services, and a security threat that could require extensive IT investment.

When Andrew Lord, Senior Manager, Optical Networks and Quantum Research at BT, first started presenting quantum technologies at customer events six or seven years ago, his was the graveyard shift, he says, entertaining attendees at the end of the day with talk of 'crazy quantum stuff.' "But that is no longer the case," says Lord. "Over the last two years, I've noticed a shift where I now speak before lunch, and customers actively seek us out."

Two developments may be causing the shift: customers’ growing awareness of the threats and opportunities that quantum computing presents, plus a recent spike in investment in quantum technology. In 2022, investors plowed $2.35 billion into quantum technology startups, which include companies in quantum computing, communications and sensing, according to McKinsey. The public sector has also been digging deep into its pockets. Last year, the United States added $1.8 billion to its previous spending on quantum technology and the EU committed an extra $1.2 billion, the consultancy noted, while China has made total investments of $15.3 billion.

According to Luke Ibbetson, Head of Group R&D at Vodafone, quantum computing's promise lies in solving, within a few hours, probabilistic problems that would take a classical computer a million years to work through. That breakthrough would enable telcos to address optimization problems in network planning, network optimization and base station placement.

The flip side is that a powerful quantum computer could also break the public-key cryptography that protects today’s IT systems from hackers. As a spokesperson at Deutsche Telekom remarks: “Telcos will have to react to the threat of quantum computers to communication security because their core business model is at risk, which is offering secure digital communications.”

The idea of quantum computing posing a security threat is not new. In 1994, Peter Shor, a mathematician working at AT&T Bell Labs, showed how a quantum computer could solve the factoring and discrete-logarithm problems on which the encryption of data relies. “His work simultaneously ignited multiple new lines of research in quantum computing, information science, and cryptography,” according to an article by the Massachusetts Institute of Technology, where Shor currently works.

Beyond The Lab

What has changed nearly thirty years on is that quantum computing is creeping out of the lab. Sizeable obstacles to large-scale quantum computing remain, however. Quantum computers are highly sensitive to interference from noise, temperature, movement or electromagnetic fields and are therefore very difficult and expensive to build and operate, especially at scale: IBM’s latest quantum processor, for example, operates at approximately 0.02 kelvin, barely above absolute zero. When Deutsche Telekom’s T-Labs tested telco use cases, it found quantum computing coped well with small problem statements. “However, when the problem size was scaled to real-world problem sizes, the quality of the QComp solution degraded,” according to the spokesperson. The company is now awaiting the next generation of quantum computing platforms to redo the analyses.

All of this means that, for now, quantum computers are not large and powerful enough to run Shor’s algorithm against real-world encryption. The question is, when will someone succeed?
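To see why efficient factoring matters, consider a deliberately toy example of textbook RSA with tiny primes: anyone who can factor the public modulus back into its two primes can recompute the private key. The numbers below are illustrative only; real RSA-2048 keys use primes hundreds of digits long, which is exactly what Shor's algorithm would make tractable.

```python
# Toy textbook RSA with tiny primes, purely to illustrate why efficient
# factoring (what Shor's algorithm provides) breaks the scheme.
p, q = 61, 53                 # secret primes (tiny for illustration)
n = p * q                     # public modulus
phi = (p - 1) * (q - 1)
e = 17                        # public exponent
d = pow(e, -1, phi)           # private exponent (requires knowing p and q)

message = 42
ciphertext = pow(message, e, n)          # encrypt with the public key (e, n)
assert pow(ciphertext, d, n) == message  # decrypt with the private key d

# An attacker who can factor n recovers p and q, hence phi, hence d:
recovered_d = pow(e, -1, (p - 1) * (q - 1))
assert pow(ciphertext, recovered_d, n) == message
```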
The Global Risk Institute tracks the quantum threat timeline. In its latest annual report, the organization asked 40 quantum experts whether they thought it likely that within the next ten years a quantum computer would break an encryption scheme like RSA-2048 in under 24 hours. Over half the respondents judged the event to be more than 5% likely, and almost a quarter considered it to be more than 50% likely.

Any breakthrough will come from a relatively small number of actors. Today, governments and academic institutions are home to around half of the 163 projects tracked worldwide by Global Quantum Intelligence, a research and analysis company, according to its CEO, André M. König, with big technology companies and specialized startups accounting for the rest.

Q2K

Nonetheless, the impact of quantum computing could be widespread, even if relatively few machines are ever built. The challenge of preparing for a post-quantum future is often called Q2K, in reference to the Y2K bug. In the late 1990s, many (but not all) governmental organizations and companies spent millions of dollars on Y2K systems integration to ensure that IT programs written from the 1960s through the 1980s would recognize dates after December 31, 1999, all while being uncertain of the scale or impact of the risk if they didn’t. Q2K differs in that there is no specific deadline, and the dangers of a major security breach are much clearer cut. It is similar, however, in demanding a lot of work on aging systems.

“Cryptography is used everywhere,” points out Lory Thorpe, IBM’s Director of Global Solutions and Offerings, Telecommunications. She adds, “Because telco systems have been built over periods of decades, people don’t actually know where cryptography is being used. So, if you start to look at the impact of public key cryptography and digital signatures being compromised, you start to look at how those two things impact open source, how that impacts the core network, the radio network, [and] OSS/BSS, network management, how the network management speaks to the network functions and so on.”

This complexity is why some analysts recommend that telcos take action now. “You’re going to find tens of thousands of vulnerabilities that are critical and vulnerable to a quantum attack. So, do you have to worry about it today? Absolutely - even if it’s in 2035,” says König. “Anyone who has ever done [IT implementation projects], and anyone who’s ever worked in cybersecurity [knows], tens of thousands of vulnerabilities that are critical [requires] years and years and years of just traditional integration work. So, even if you’re skeptical about quantum, if you haven’t started today, it is almost too late already.”

Don’t Panic!

For the past two to three years, Vodafone has been preparing to migrate some of its cryptographic systems to be quantum-safe, according to Ibbetson. He believes there is no need to panic, but telcos must start planning now. König says, "The telecoms industry as a whole is not moving as quickly as some other sectors, notably the banking, pharmaceutical, and automotive industries. In these sectors, post-quantum security planning often involves CEOs at a very strategic level." For this reason, Vodafone joined forces with IBM in September 2022 to establish the GSMA Post-Quantum Telco Network Taskforce.
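Thorpe's point that "people don't actually know where cryptography is being used" suggests the first practical step is an inventory. The sketch below is a hypothetical, minimal first pass at one: it greps source and configuration files for mentions of quantum-vulnerable algorithms. A real cryptographic bill of materials would also need to cover certificates, TLS configurations, protocol settings and compiled binaries.

```python
# Hypothetical first pass at a cryptographic inventory: scan text files for
# mentions of quantum-vulnerable public-key algorithms.
import pathlib
import re

QUANTUM_VULNERABLE = re.compile(r"\b(RSA|DSA|ECDSA|ECDH|DH|ElGamal)\b", re.IGNORECASE)

def scan(root: str):
    findings = []
    for path in pathlib.Path(root).rglob("*"):
        if path.suffix not in {".py", ".java", ".go", ".conf", ".yaml", ".yml", ".tf"}:
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for lineno, line in enumerate(text.splitlines(), start=1):
            if QUANTUM_VULNERABLE.search(line):
                findings.append((str(path), lineno, line.strip()))
    return findings

if __name__ == "__main__":
    for path, lineno, line in scan("."):
        print(f"{path}:{lineno}: {line}")
```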
“Even though many industries are preparing to be able to defend against future quantum threats, we didn’t see anything happening particularly in the telco space, and we wanted to make sure that it was a focus,” says Ibbetson. “Obviously it will turn into an IT-style transformation, but it’s starting now with understanding what it is we need to mobilize that.”

AT&T has also been working to pinpoint what needs to be addressed. Last year, the company said it aims to be quantum-ready by 2025, in the sense that it will have done its due diligence and identified a clear path forward.

Minding Your PQCs

Companies across multiple sectors are looking to post-quantum cryptography (PQC) to secure their systems, using new algorithms that are much harder to crack than RSA. König contends that PQC needs to become “a standard component of companies’ agile defense posture” and believes the development of PQC systems by software and hardware companies will help keep upgrade costs under control. “From a financial point of view, vendors do a fantastic job bringing this to market and making it very accessible,” says König. Lord, who has been researching quantum technologies at BT for over a decade, is also confident that there is “going to be much more available technology.” As a result, even smaller telcos will be able to invest in securing their systems. “It doesn't need a big boy with lots of money [for] research to do something around PQC. There’s a lot of work going on to ratify the best of those solutions,” says Lord.

There are several reasons why eyes are on software-based PQC. Firstly, it can be used to protect data that was encrypted in the past and that advances in quantum computing will make vulnerable in the future. In addition, the quantum-based alternative to PQC for securing network traffic, quantum key distribution (QKD), comes with a huge drawback for wireless operators. QKD is hardware-based and uses quantum mechanics to prevent interception across optical fiber and satellite (i.e., free-space optical) networks, making it secure, albeit expensive. But for reasons of physics, it does not work on mobile networks.

Setting Standards

Given the importance of PQC, a lot of effort is going into standardizing robust algorithms. The political weight of the US and the size of its technology industry mean that the US government’s National Institute of Standards and Technology (NIST) is playing a key role in the technical evaluation of post-quantum algorithms and the creation of standards. NIST expects to publish the first set of post-quantum cryptography standards in 2024. In the meantime, Dustin Moody, a NIST mathematician, recommends (in answers emailed to inform) that companies “become familiar and do some testing with the algorithms being standardized, and how they will fit in your products and applications. Ensure that you are using current best-practice cryptographic algorithms and security strengths in your existing applications. Have somebody designated to be leading the effort to transition.”

QKD

There is no absolute guarantee, however, that a quantum computer in the future won’t find a way to crack PQC. Therefore, institutions such as government agencies and banks remain interested in using QKD over fiber and satellite networks to ensure the highest levels of security for data transmission.
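For intuition on how QKD distributes keys, the BB84 protocol encodes random key bits in randomly chosen photon polarization bases; sender and receiver later compare which bases they used (not the bits) and keep only the positions where the bases matched, while an eavesdropper measuring in the wrong basis disturbs the states and reveals herself. The toy simulation below is a purely classical sketch of that sifting step, not a model of the underlying physics or of eavesdropper detection.

```python
# Classical toy sketch of BB84 key sifting: Alice encodes random bits in random
# bases, Bob measures in random bases, and only positions where the bases match
# are kept. Illustrates the protocol's structure only, not its quantum security.
import secrets

def random_bits(n):
    return [secrets.randbelow(2) for _ in range(n)]

def bb84_sift(n=32):
    alice_bits = random_bits(n)
    alice_bases = random_bits(n)            # 0 = rectilinear, 1 = diagonal
    bob_bases = random_bits(n)

    bob_results = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if a_basis == b_basis:
            bob_results.append(bit)                    # same basis: bit read correctly
        else:
            bob_results.append(secrets.randbelow(2))   # wrong basis: random outcome

    # Publicly compare bases (never bits) and keep only the matching positions.
    return [bit for bit, a, b in zip(bob_results, alice_bases, bob_bases) if a == b]

print(bb84_sift())
```

Government-backed programs are now trying to run exactly this kind of key distribution at national and continental scale.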
The European Commission, for example, is working with the 27 EU Member States and the European Space Agency (ESA) to design, develop and deploy a QKD-based European Quantum Communication Infrastructure (EuroQCI). It will be made up of fiber networks linking strategic sites at national and cross-border levels, plus a space segment based on satellites. “EuroQCI will reinforce the protection of Europe’s governmental institutions, their data centers, hospitals, energy grids, and more,” according to the EU. Telecom operators are involved in some of the national programs, including Orange, which is coordinating France’s part of the program, called FranceQCI (Quantum Communication Infrastructure). Separately, this month, Toshiba and Orange announced they had successfully demonstrated the viability of deploying QKD on existing commercial networks.

Outside the EU, BT has already built and is now operating a commercial metro quantum-encryption network in London. “The London network has three quantum nodes, which are the bearers carrying the quantum traffic for all of the access ingress,” explains Lord. A customer in London's Canary Wharf, for example, could link via the network to the nearest quantum-enabled BT exchange. From there, it joins a metro network, which carries the keys from multiple customers “in an aggregated cost-effective way to the egress points,” according to Lord. “It is not trivial because you can mess things up and [get] the wrong keys,” explains Lord. “You really have to be more careful about authentication and key management. And then it's all about how you engineer your quantum resources to handle bigger aggregation.” It also gives BT the opportunity to explore how to integrate quantum systems downstream into its whole network. “What I'm telling the quantum world is that they need to get into the real world, because a system that uses quantum is still going to be 90% non-quantum, and all of the usual networking rules and engineering practices apply. You still need to know how to handle fiber. You still need to know how to provision a piece of equipment and integrate it into a network.”

SK Telecom is also heavily involved in quantum-related research, with developments including QKD systems for the control and interworking of quantum cryptography communication networks. Japan is another important center of QKD research: a QKD network has existed in Tokyo since 2010, and in 2020, financial services company Nomura Securities Co., Ltd. tested the transmission of data across the Tokyo QKD network.

As the EU’s project makes clear, satellites are an important part of the mix. Lord expects satellite-based QKD networks to come on stream in 2025 and 2026, enabling the purchase of wholesale quantum keys from a dedicated satellite quantum provider. Back in 2017, China used a satellite to make the first very long-distance transmission of data secured by QKD, between Beijing and Vienna, a distance of 7,000 km.

Securing The Edge

There are additional efforts to secure communications with edge devices. BT’s Lord, for example, sees a role for digital fingerprints for IoT devices, phones, cars and smart meters in the form of a physical unclonable function (PUF) silicon chip, which, because of random imperfections in its manufacture, cannot be copied. In the UK, BT is trialing a combination of QKD and PUF to secure the end-to-end journey of a driverless car.
The connection to the roadside depends on standard radio with PUF authentication, while transmission from the roadside unit onward, as well as the overall control of the autonomous vehicle network, incorporates QKD, explains Lord. SK Telecom has developed what it describes as a quantum-enhanced cryptographic chip with Korea Computer & Systems (KCS) and ID Quantique. Telefónica Spain has partnered on the development of a quantum-safe 5G SIM card and has integrated quantum technology into its cloud service hosted in its virtual data centers. Given China’s heavy investment in quantum technologies, it is no surprise to see its telecom operators involved in the field. China Telecom, for example, recently invested three billion yuan ($434 million) in quantum technology deployment, according to Reuters.

Quantum in The Cloud

Some of America's biggest technology companies are investing in quantum computing. Today, it is even possible to access quantum computing facilities via the cloud, albeit at a small scale. IBM's cloud access to quantum computers is free at the most basic level, rising to $1.60 per second at the next level. And that is just the beginning: America's big tech companies are racing to build quantum computers at scale.

One measure of scale is the size of a quantum processor, which is measured in qubits. While a traditional computer stores information as a 0 or a 1, a qubit can represent both 0 and 1 simultaneously. This property enables a quantum computer to explore multiple potential solutions to a problem at once, and the greater the stability of its qubits, the more efficient it becomes.

IBM has a long history in quantum research and development. In 1998, it unveiled what was then a ground-breaking 2-qubit computer. By 2022, it had produced a 433-qubit processor, and in 2023 it aims to produce a 1,121-qubit processor. Separately, this month it announced the construction of its first quantum data center in Europe, which it expects to begin offering commercial services as of next year. Google is also firmly in the race to build a large-scale quantum computer. In 2019, a paper in Nature featured Google’s Sycamore processor and the speed with which it undertakes computational tasks. More recent work includes an experimental demonstration that errors can be reduced by increasing the number of qubits. Microsoft reckons that "a quantum machine capable of solving many of the hardest problems facing humanity will ultimately require at least 1 million stable qubits that can perform 1 quintillion operations while making at most a single error." To this end, it is working on what it calls a new type of qubit, a topological qubit. Amazon, for its part, announced in 2021 an AWS Center for Quantum Computing on the Caltech campus to build a fault-tolerant quantum computer.
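The description of a qubit holding 0 and 1 "simultaneously" can be made concrete with a two-element state vector: applying a Hadamard gate to the |0⟩ state produces an equal superposition, and measurement probabilities come from the squared amplitudes. A minimal NumPy sketch, for intuition only:

```python
# Minimal state-vector illustration of a single qubit in superposition.
import numpy as np

ket0 = np.array([1.0, 0.0])                   # |0> as a 2-element state vector
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = H @ ket0                              # equal superposition of |0> and |1>
probabilities = np.abs(state) ** 2            # Born rule: squared amplitudes

print(state)          # [0.7071..., 0.7071...]
print(probabilities)  # [0.5, 0.5] -> measurement yields 0 or 1 with equal probability
```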

Read More
Software, Future Tech, Application Development Platform

Over the Waterfall to GitOps

Article | August 16, 2023

One of the first steps on the journey to cloud-native is transforming culture. This starts with embracing Agile methodology, followed by implementation of DevOps processes and eventually GitOps, as we explore in this extract from the recent e-book Mind the gap: bridging the skills divide on the journey to cloud native.

Most CSPs agree that culture, including governance and skills, is the single biggest obstacle to adopting a cloud-native architecture. Traditional waterfall project management focuses on a linear progression, where one task or process needs to be completed before the next can start. This approach is time-consuming and costly, and it stifles innovation. It’s a major reason a CSP typically takes more than a year to develop a new service. Adopting Agile methodology is a completely new way of working that focuses on building cross-functional teams to speed innovation and service creation. It requires CSPs to seek individuals who have new project management skills and who are adaptable and quick-thinking. Agile may not be suitable for every aspect of the business or for every project, but it is critical for moving to cloud-based environments and, eventually, to cloud-native ones.

Agile’s assumptions

• Early, continuous delivery of software leads to happy customers
• Changing requirements are always welcome, even in late development
• Working software is delivered frequently
• Business teams and developers work together every day
• Projects are built around motivated and trusted individuals
• Face-to-face is the best way to communicate
• Working software is the principal measure of progress
• Development is sustainable and constant
• Attention to technical excellence and good design are required
• Simplicity is essential
• The best architectures emerge from self-organizing teams
• Teams look for ways to be more effective and adjust accordingly

There are lots of Agile approaches, but many CSPs use a model made popular by Spotify, which organizes teams into ‘squads,’ ‘tribes,’ ‘chapters,’ and ‘guilds.’ Vodafone Group follows this model and uses ‘very, very flat, non-hierarchical governance,’ according to Dr. Lester Thomas, Chief IT Systems Architect at Vodafone Group. “We’ve learned doing this in the digital space, but we’re trying to adopt that software approach right into our network.”

Culture Eats Technology

UScellular began adopting Agile methodology about five years ago, and the company is implementing cloud-native applications wherever they make sense. During its shift to the new way of working, cultural change has been the most difficult obstacle to overcome, significantly harder than technological change, according to Kevin Lowell, the company’s Chief People Officer and former Executive VP in charge of IT. The shift started with creating ‘a compelling why’ – in this case, improving how customers experience using UScellular services. The company replaced some waterfall processes with iterative Agile processes managed in scrums and implemented in sprints. The IT team also began meeting regularly with business stakeholders and educating them about how Agile works.

Telecom Argentina is also embracing Agile. It is working with Red Hat to adopt a framework called Team Topologies to create a more efficient way of collaborating. The company is applying Team Topologies within its network division to create cross-functional teams that focus not only on the evolution and operation of technological platforms but also on creating and delivering services.
From Agile to DevOps

While Agile methodologies help to establish communication between IT teams and other stakeholders in the company, DevOps goes further by introducing an end-to-end software lifecycle that establishes a continuous flow of development, integration, testing, delivery and deployment. Google’s approach to DevOps, called Site Reliability Engineering (SRE), has been widely adopted in telecoms. It provides the foundation for the ODA Canvas, and it’s how Vodafone Group is implementing DevOps.

Vodafone is a cloud-native pioneer. For the past several years, the company has been transforming into a platform provider, using what it calls a ‘telco-as-a-service’ or TaaS strategy. Vodafone is becoming a software company on its quest to become a techco, which involves hiring 7,000 software engineers to add to the 9,000 already in the company. A key driver for embracing a cloud-native approach is “moving from our millions of human customers to billions of things,” says Thomas. Instead of offering just four primary services – fixed voice, broadband Internet, mobility and TV – he envisions using 5G network slicing to support thousands of IoT services per vertical market. “Unless we can drive this through software-driven approaches and automation, we’re not going to be successful,” he says.

From DevOps to GitOps

The problem with DevOps, however, is that most CSPs aren’t developing their own software; they buy solutions from vendor partners. As James Crawshaw, Principal Analyst, Telco IT & Operations at Omdia, notes in a research report, this makes it difficult for operators to create CI/CD pipelines that cut across the organizational boundaries between CSPs and suppliers. To address this, CSPs “have adapted DevOps to their needs and created GitOps, which they use to take third-party applications and deploy them on their own platforms,” Crawshaw explains. Philippe Ensarguet, Group CTO at Orange Business Services, recently explained that GitOps requires continuous integration and continuous operations, or CI/CO. This means moving away from a prescriptive way of implementing operations to a declarative approach that supports full automation.

What is GitOps?

“If you rely mainly on the prescriptive approach, the day you want to move into production and scale up the number of applications you implement, you have to manage it purely with humans, and you hit the wall on scalability,” says Ensarguet. William Caban, Telco Chief Architect at Red Hat, sees GitOps as foundational to the concept of zero-touch, zero-wait and zero-trouble services, which will be orchestrated end-to-end in autonomous networks. “This is exactly what GitOps is about: event-driven, intent-based networks,” he says. “It becomes the operational model for architectures based on the ODA and autonomous networks.”

CSPs must hire people with software and automation skills for GitOps. They also must reskill network experts, such as radio access network (RAN) engineers, to work in CI/CO teams so everyone uses common terminology. Some operators are going even further by creating centers of excellence (CoEs) where cross-functional teams from business, network and operations collaborate. “In GitOps, it is also necessary to codify team members’ knowledge, so that even as people move around or leave the company, the software development and operations lifecycle processes are not disrupted,” Caban says.
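The shift from prescriptive to declarative operations that Ensarguet and Caban describe is typically implemented as a reconciliation loop: the desired state is declared in Git, an operator continually compares it with the observed state of the platform, and any drift is corrected automatically. Below is a minimal, language-agnostic sketch of that loop, written in Python for illustration; the fetch, observe and apply functions are placeholders rather than any particular tool's API.

```python
# Illustrative GitOps-style reconciliation loop: desired state lives in Git,
# drift between desired and observed state is corrected automatically.
import time

def fetch_desired_state():
    # Placeholder: in practice, read declarative manifests from a Git repository.
    return {"api-gateway": {"replicas": 3, "version": "1.4.2"}}

def observe_actual_state():
    # Placeholder: in practice, query the platform (e.g., an orchestrator API).
    return {"api-gateway": {"replicas": 2, "version": "1.4.2"}}

def apply_change(name, desired):
    print(f"reconciling {name}: applying {desired}")

def reconcile_once():
    desired, actual = fetch_desired_state(), observe_actual_state()
    for name, spec in desired.items():
        if actual.get(name) != spec:
            apply_change(name, spec)   # drift detected: converge toward what Git declares

def run(interval_seconds=30):
    while True:
        reconcile_once()
        time.sleep(interval_seconds)
```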

Read More
Neural Networks

Empowering Industry 4.0 with Artificial Intelligence

Article | September 15, 2023

The next step in industrial technology is about robotics, computers and equipment becoming connected to the Internet of Things (IoT) and enhanced by machine learning algorithms. Industry 4.0 has the potential to be a powerful driver of economic growth, predicted to add between $500 billion and $1.5 trillion in value to the global economy between 2018 and 2022, according to a report by Capgemini.

Read More

How Artificial Intelligence Is Transforming Businesses

Article | February 12, 2020

While many people still associate AI with sci-fi novels and films, its reputation as the antagonist of fictional dystopian worlds is becoming a thing of the past as the technology grows ever more integrated into our everyday lives. AI is increasingly present not just in the home, through devices such as Alexa, but throughout businesses everywhere, disrupting a variety of industries with often tremendous results. The technology has helped to streamline even the most mundane of tasks while having a breathtaking impact on companies' efficiency and productivity.

Read More

Related News

General AI

AI Quality Leader TruEra Receives Investment from Hewlett Packard Enterprise

TruEra | June 23, 2022

Hewlett Packard Enterprise announced today that it has invested in TruEra through its venture capital program, Hewlett Packard Pathfinder. TruEra offers the first suite of AI Quality management solutions for managing model performance, explainability, and societal impact. The investment extends the $25M Series B round that TruEra announced in March 2022.

Hewlett Packard Pathfinder invests in market-leading start-ups, develops solutions fusing the technology of portfolio companies with Hewlett Packard Enterprise products, and designs collaborative go-to-market strategies. Hewlett Packard Pathfinder also keeps a careful eye on longer-term disruptive innovation, supporting the development of cutting-edge technology. "We're excited to become an investor and to partner with TruEra in developing comprehensive solutions for our enterprise customers in conjunction with our High-Performance Computing offering," said Paul Glaser, Vice President and Head of Hewlett Packard Pathfinder.

TruEra addresses AI quality, the next significant challenge in AI. ML teams can use TruEra's solutions to explain, analyze, and test the performance, risk, and responsible-AI characteristics of models early in the development process, allowing them to correct problems quickly and maintain peak performance. Numerous Fortune 1000 businesses have chosen TruEra as their preferred provider because of its distinctive, whole-lifecycle approach to model quality.

"AI model quality and ML Ops have emerged as considerable challenges for enterprises deploying and scaling machine learning models. Solving these challenges is imperative for AI success, and TruEra stands out for its deep expertise, differentiated technology and practical experience helping companies deliver and monitor AI applications," said Ali Wasti, Managing Director, Hewlett Packard Pathfinder.

Top-tier investors such as Menlo Ventures, Greylock Partners, and Wing Venture Capital have contributed approximately $45 million to TruEra. The company's technology is based on academic research conducted by co-founders Anupam Datta and Shayak Sen at Carnegie Mellon University. "Hewlett Packard Enterprise is a leading, trusted provider to the enterprise, and is known for its ability to ensure that cutting-edge innovation delivers proven results. We're looking forward to working closely with the HPE team as partners on customer engagements," said Will Uppington, CEO and co-founder, TruEra.
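As a generic illustration of the kind of check that AI Quality tooling automates, the sketch below compares a model's accuracy on a fresh evaluation slice against its baseline and flags degradation. It is not TruEra's API; the function names and threshold are hypothetical.

```python
# Generic model-quality check: compare live accuracy against a baseline and
# flag degradation. Names and thresholds are illustrative, not TruEra's API.
def accuracy(predictions, labels):
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

def check_model_quality(predictions, labels, baseline_accuracy, max_drop=0.05):
    current = accuracy(predictions, labels)
    degraded = (baseline_accuracy - current) > max_drop
    return {"current_accuracy": current, "baseline": baseline_accuracy, "degraded": degraded}

report = check_model_quality(
    predictions=[1, 0, 1, 1, 0, 1, 0, 0],
    labels=[1, 0, 1, 0, 0, 1, 1, 0],
    baseline_accuracy=0.90,
)
print(report)  # {'current_accuracy': 0.75, 'baseline': 0.9, 'degraded': True}
```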

Read More

HPE accelerates Artificial Intelligence innovation with enterprise-grade solution for managing entire machine learning lifecycle

Hewlett Packard Enterprise | September 10, 2019

Hewlett Packard Enterprise (HPE) today announced a container-based software solution, HPE ML Ops, to support the entire machine learning model lifecycle for on-premises, public cloud and hybrid cloud environments. The new solution introduces a DevOps-like process to standardize machine learning workflows and accelerate AI deployments from months to days. Enterprise AI adoption has more than doubled in the last four years, and organizations continue to invest significant time and resources in building machine learning and deep learning models for a wide range of AI use cases such as fraud detection, personalized medicine, and predictive customer analytics. However, the biggest challenge faced by technical professionals is operationalizing ML, also known as the “last mile,” to successfully deploy and manage these models, and unlock business value. According to Gartner, by 2021, at least 50 percent of machine learning projects will not be fully deployed due to lack of operationalization.
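The "last mile" problem described above is about getting a trained model out of a notebook and into a repeatable, versioned workflow. The sketch below shows one minimal, generic lifecycle step (train, evaluate, register a versioned artifact with metadata) using scikit-learn; it is illustrative only and is not HPE ML Ops' interface.

```python
# Minimal, generic sketch of an ML lifecycle step - train, evaluate, and register
# a versioned model artifact - of the kind ML Ops tooling standardizes.
import json
import pathlib
import pickle

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
accuracy = model.score(X_test, y_test)

# "Register" the artifact: persist the model plus metadata a deployment step can consume.
registry = pathlib.Path("model_registry/iris/v1")
registry.mkdir(parents=True, exist_ok=True)
(registry / "model.pkl").write_bytes(pickle.dumps(model))
(registry / "metadata.json").write_text(json.dumps({"accuracy": accuracy, "framework": "scikit-learn"}))
```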

Read More

Hewlett Packard Enterprise advances the cloud experience through intelligence and composability

Hewlett Packard Enterprise | November 04, 2019

Hewlett Packard Enterprise (HPE) today announced combined intelligence and composability offerings by integrating its artificial intelligence (AI) and machine learning-driven HPE Primera storage platform with the composability in HPE Synergy and HPE Composable Rack, helping customers rapidly deliver new apps and innovations to propel their businesses forward. This unique combination allows customers to deliver services on an intelligent cloud platform, offering the flexibility to support any application and service level agreement (SLA) with cloud-like agility, extreme resiliency, and seamless scalability. Additionally, expanding on the recent introduction of HPE Synergy support for VMware Cloud Foundation (VCF), HPE Composable Rack, the HPE composable rack-scale solution, now supports VCF for hybrid cloud deployments.

Read More

Events