Develop Your Cloud Computing Skills in 2019

January is a time when most of us create personal and professional plans for the year. For me, this time of year is about creating a plan for investing in myself so that I am better prepared for the innovation happening in cloud computing. Today, enterprises spend over $3 trillion on IT. According to Forrester's "Predictions 2019: Cloud Computing" report, the global cloud market will exceed $200 billion in 2019, and distributed and cloud computing skills are the most sought-after skills globally, according to LinkedIn. As enterprises move their workloads to the cloud, bridging the skills gap in the current workforce is essential to drive adoption and successful deployments.

Spotlight

IonIdea

"IonIdea's vision is simply to become one of the most trusted companies in the technology and business process solutions industries. Our mission is to deliver excellent products, value-added solutions and high-quality services by bringing together passionate professionals, innovative technologies and the very best practices. "

OTHER ARTICLES
Neural Networks

Are Telcos Ready for a Quantum Leap?

Article | September 15, 2023

Quantum technologies present both an opportunity for telcos to solve difficult problems and provide new services, and a security threat that could require extensive IT investment.

When Andrew Lord, Senior Manager, Optical Networks and Quantum Research at BT, first started presenting quantum technologies at customer events six or seven years ago, his was the graveyard shift, he says, entertaining attendees at the end of the day with talk of 'crazy quantum stuff.' "But that is no longer the case," says Lord. "Over the last two years, I've noticed a shift where I now speak before lunch, and customers actively seek us out."

Two developments may be causing the shift: customers' growing awareness of the threats and opportunities that quantum computing presents, and a recent spike in investment in quantum technology. In 2022, investors plowed $2.35 billion into quantum technology startups, which include companies in quantum computing, communications and sensing, according to McKinsey. The public sector has also been digging deep into its pockets. Last year, the United States added $1.8 billion to its previous spending on quantum technology and the EU committed an extra $1.2 billion, the consultancy noted, while China has made total investments of $15.3 billion.

According to Luke Ibbetson, Head of Group R&D at Vodafone, quantum computing's promise lies in solving, within a few hours, probabilistic equations that would take a classical computer a million years. This breakthrough would enable telcos to address optimization problems related to network planning and base station placement.

The flip side is that a powerful quantum computer could also break the public-key cryptography that protects today's IT systems from hackers. As a spokesperson at Deutsche Telekom remarks: "Telcos will have to react to the threat of quantum computers to communication security because their core business model is at risk, which is offering secure digital communications."

The idea of quantum computing posing a security threat is not new. In 1994, Peter Shor, a mathematician working at AT&T Bell Labs, showed how a quantum computer could efficiently solve the integer factorization and discrete logarithm problems that underpin public-key encryption (a toy sketch of that dependency follows below). "His work simultaneously ignited multiple new lines of research in quantum computing, information science, and cryptography," according to an article by the Massachusetts Institute of Technology, where Shor now works.

Beyond The Lab

What has changed nearly thirty years on is that quantum computing is creeping out of the lab. Sizeable obstacles to large-scale quantum computing remain, however. Quantum computers are highly sensitive to interference from noise, temperature, movement or electromagnetic fields, and are therefore very difficult and expensive to build and operate, especially at scale: IBM's latest quantum processor, for example, operates at approximately 0.02 kelvin, a fraction of a degree above absolute zero.

When Deutsche Telekom's T-Labs tested telco use cases, it found quantum computing coped well with small problem statements. "However, when the problem size was scaled to real-world problem sizes, the quality of the QComp solution degraded," according to the spokesperson. The company is now awaiting the next generation of quantum computing platforms to redo the analyses.

All of this means that, for now, quantum computers are not large and reliable enough to run Shor's algorithm against real-world key sizes. The question is, when will someone succeed?
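Shor's result is concrete: anyone who can factor an RSA public modulus can derive the private key. A toy Python sketch of that dependency, with tiny primes and trial division standing in for the quantum computer (real RSA uses 2048-bit moduli that no known classical method can factor in practice):

```python
# Toy illustration of why factoring breaks RSA-style encryption. The primes
# here are tiny and the 'attack' is trial division; a real RSA modulus is
# 2048 bits, and it is exactly this factoring step that Shor's algorithm
# would make fast on a large, fault-tolerant quantum computer.

def toy_rsa_keypair(p: int, q: int, e: int = 17):
    """Build a tiny RSA keypair from two secret primes."""
    n = p * q                    # public modulus
    phi = (p - 1) * (q - 1)      # computable only if you know p and q
    d = pow(e, -1, phi)          # private exponent (Python 3.8+ modular inverse)
    return (n, e), d

def break_by_factoring(n: int, e: int) -> int:
    """An attacker who can factor n recovers the private exponent."""
    for p in range(2, n):
        if n % p == 0:           # trial division; Shor replaces this step
            q = n // p
            return pow(e, -1, (p - 1) * (q - 1))
    raise ValueError("n is prime")

(public_n, public_e), private_d = toy_rsa_keypair(61, 53)
ciphertext = pow(42, public_e, public_n)               # encrypt the message 42

recovered_d = break_by_factoring(public_n, public_e)
assert recovered_d == private_d
assert pow(ciphertext, recovered_d, public_n) == 42    # plaintext recovered
print("private exponent recovered by factoring:", recovered_d)
```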
The Global Risk Institute tracks the quantum threat timeline. In its latest annual report, the organization asked 40 quantum experts whether they thought it likely that, within the next ten years, a quantum computer would break an encryption scheme like RSA-2048 in under 24 hours. Over half the respondents judged the event to be more than 5% likely, and almost a quarter considered it to be more than 50% likely.

Any breakthrough will come from a relatively small number of actors. Today, governments and academic institutions are home to around half of the 163 quantum computing projects tracked worldwide by Global Quantum Intelligence, a research and analysis company, according to its CEO, André M. König, with big technology companies and specialized startups accounting for the rest.

Q2K

Nonetheless, the impact of quantum computing could be widespread, even if relatively few machines are built. The challenge of preparing for a post-quantum future is often called Q2K, in reference to the Y2K bug. In the late 1990s, many (but not all) governmental organizations and companies spent millions of dollars on Y2K systems integration to ensure that IT programs written from the 1960s through the 1980s would be able to recognize dates after December 31, 1999, all while being uncertain of the scale or impact of the risk if they didn't. Q2K differs in that there is no specific deadline, and the dangers of a major security breach are much clearer cut. It is similar, however, in demanding a lot of work on aging systems.

"Cryptography is used everywhere," points out Lory Thorpe, IBM's Director of Global Solutions and Offerings, Telecommunications. She adds, "Because telco systems have been built over periods of decades, people don't actually know where cryptography is being used. So, if you start to look at the impact of public key cryptography and digital signatures being compromised, you start to look at how those two things impact open source, how that impacts the core network, the radio network, [and] OSS/BSS, network management, how the network management speaks to the network functions and so on." (A sketch of the kind of inventory scan this implies appears below.)

This complexity is why some analysts recommend that telcos take action now. "You're going to find tens of thousands of vulnerabilities that are critical and vulnerable to a quantum attack. So, do you have to worry about it today? Absolutely, even if it's in 2035," says König. "Anyone who has ever done [IT implementation projects], and anyone who's ever worked in cybersecurity [knows], tens of thousands of vulnerabilities that are critical [requires] years and years and years of just traditional integration work. So, even if you're skeptical about quantum, if you haven't started today, it is almost too late already."

Don't Panic!

For the past two to three years, Vodafone has been preparing to migrate some of its cryptographic systems to be quantum-safe, according to Ibbetson. He believes there is no need to panic, but telcos must start planning now. König said: "The telecoms industry as a whole is not moving as quickly as some other sectors, notably the banking, pharmaceutical, and automotive industries. In these sectors, post-quantum security planning often involves CEOs at a very strategic level." For this reason, Vodafone joined forces with IBM in September 2022 to establish the GSMA Post-Quantum Telco Network Taskforce.
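Thorpe's point that "people don't actually know where cryptography is being used" is why migration roadmaps typically begin with a cryptographic inventory. A minimal sketch of such a scan, assuming a plain-text codebase; the file extensions and algorithm patterns are illustrative assumptions, not any vendor's tooling:

```python
# Minimal sketch of a cryptographic-inventory scan, typically the first step
# of a 'Q2K' migration. The file extensions and algorithm patterns below are
# illustrative assumptions, not a real telco's configuration.

import re
from pathlib import Path

# Algorithms whose security rests on factoring or discrete logarithms, the
# problems Shor's algorithm solves efficiently.
QUANTUM_VULNERABLE = re.compile(
    r"\b(RSA[-_ ]?(1024|2048|4096)?|ECDSA|ECDH|DSA|DH)\b", re.IGNORECASE
)

def scan_tree(root: str) -> dict:
    """Return {file_path: [(line_no, match), ...]} for every hit."""
    findings = {}
    for path in Path(root).rglob("*"):
        if not path.is_file() or path.suffix not in {".py", ".java", ".c", ".conf", ".yaml"}:
            continue
        text = path.read_text(errors="ignore")
        for line_no, line in enumerate(text.splitlines(), start=1):
            for match in QUANTUM_VULNERABLE.finditer(line):
                findings.setdefault(str(path), []).append((line_no, match.group()))
    return findings

if __name__ == "__main__":
    for file_path, hits in scan_tree("./src").items():
        print(f"{file_path}: {len(hits)} quantum-vulnerable reference(s)")
```

In a real program this list feeds years of prioritized remediation work, which is exactly the "traditional integration work" König describes.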
"Even though many industries are preparing to be able to defend against future quantum threats, we didn't see anything happening particularly in the telco space, and we wanted to make sure that it was a focus," says Ibbetson. "Obviously it will turn into an IT-style transformation, but it's starting now with understanding what it is we need to mobilize that." AT&T has also been working to pinpoint what needs to be addressed. Last year, the company said it aims to be quantum-ready by 2025, in the sense that it will have done its due diligence and identified a clear path forward.

Minding Your PQCs

Companies across multiple sectors are looking to post-quantum cryptography (PQC) to secure their systems, using new algorithms designed to withstand attack by both classical and quantum computers. König contends that PQC needs to become "a standard component of companies' agile defense posture" and believes the development of PQC systems by software and hardware companies will help keep upgrade costs under control. "From a financial point of view, vendors do a fantastic job bringing this to market and making it very accessible," says König.

Lord, who has been researching quantum technologies at BT for over a decade, is also confident that there is "going to be much more available technology." As a result, even smaller telcos will be able to invest in securing their systems. "It doesn't need a big boy with lots of money [for] research to do something around PQC. There's a lot of work going on to ratify the best of those solutions," says Lord.

There are several reasons why eyes are on software-based PQC. Firstly, it can be used to re-protect data that was encrypted in the past and that advances in quantum computing will make vulnerable in the future (see the hybrid key-exchange sketch below). In addition, the quantum-based alternative to PQC for securing network traffic, quantum key distribution (QKD), comes with a huge drawback for wireless operators. QKD is hardware-based and uses quantum mechanics to prevent interception across optical fiber and satellite (i.e., free-space optical) networks, making it secure, albeit expensive. But for reasons of physics, it does not work on mobile networks.

Setting Standards

Given the importance of PQC, a lot of effort is going into standardizing robust algorithms. The political weight of the US and the size of its technology industry mean that the US government's National Institute of Standards and Technology (NIST) is playing a key role in the technical evaluation of candidate post-quantum algorithms and the creation of standards. NIST expects to publish the first set of post-quantum cryptography standards in 2024.

In the meantime, Dustin Moody, a NIST mathematician, recommends (in answers emailed to Inform) that companies "become familiar and do some testing with the algorithms being standardized, and how they will fit in your products and applications. Ensure that you are using current best-practice cryptographic algorithms and security strengths in your existing applications. Have somebody designated to be leading the effort to transition."

QKD

There is no absolute guarantee, however, that a quantum computer in the future won't find a way to crack PQC. Therefore, institutions such as government agencies and banks remain interested in using QKD over fiber and satellite networks to ensure the highest levels of security for data transmission.
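A common transition pattern while PQC standards settle is 'hybrid' key establishment, which mixes a classical exchange with a post-quantum KEM so a session stays secure unless both are broken. A minimal sketch: the X25519 and HKDF calls use the real Python cryptography package, while the pqc_kem_encapsulate function is a hypothetical stand-in for a standardized KEM such as ML-KEM (Kyber), which the article does not name a library for:

```python
# Hybrid key establishment sketch: combine a classical X25519 secret with a
# post-quantum KEM secret, so compromise of either scheme alone is not enough.
# X25519/HKDF are the real 'cryptography' package APIs; pqc_kem_encapsulate
# is a placeholder, NOT a real PQC implementation.

import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def pqc_kem_encapsulate(peer_public_key: bytes):
    # Placeholder: a real KEM returns (ciphertext, shared_secret).
    secret = os.urandom(32)
    return secret, secret  # insecure stub; illustration only

# Classical part: X25519 Diffie-Hellman exchange.
alice_x = X25519PrivateKey.generate()
bob_x = X25519PrivateKey.generate()
classical_secret = alice_x.exchange(bob_x.public_key())

# Post-quantum part (stubbed).
_, pq_secret = pqc_kem_encapsulate(b"bob-pqc-public-key")

# Derive one session key from both secrets.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid-kex",
).derive(classical_secret + pq_secret)
print("derived 256-bit session key:", session_key.hex())
```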
The European Commission, for example, is working with the 27 EU Member States and the European Space Agency (ESA) to design, develop and deploy a QKD-based European Quantum Communication Infrastructure (EuroQCI). It will be made up of fiber networks linking strategic sites at national and cross-border levels and a space segment based on satellites. "EuroQCI will reinforce the protection of Europe's governmental institutions, their data centers, hospitals, energy grids, and more," according to the EU. Telecom operators are involved in some of the national programs, including Orange, which is coordinating France's part of the program, called FranceQCI (Quantum Communication Infrastructure). Separately, this month, Toshiba and Orange announced they had successfully demonstrated the viability of deploying QKD on existing commercial networks.

Outside the EU, BT has already built and is now operating a commercial metro quantum-encryption network in London. "The London network has three quantum nodes, which are the bearers carrying the quantum traffic for all of the access ingress," explains Lord. A customer in London's Canary Wharf, for example, could link via the network to the nearest quantum-enabled BT exchange. From there, the traffic joins a metro network, which carries the keys from multiple customers "in an aggregated cost-effective way to the egress points," according to Lord. "It is not trivial because you can mess things up and [get] the wrong keys," he explains. "You really have to be more careful about authentication and key management. And then it's all about how you engineer your quantum resources to handle bigger aggregation."

The network also gives BT the opportunity to explore how to integrate quantum systems downstream into its whole network. "What I'm telling the quantum world is that they need to get into the real world, because a system that uses quantum is still going to be 90% non-quantum, and all of the usual networking rules and engineering practices apply. You still need to know how to handle fiber. You still need to know how to provision a piece of equipment and integrate it into a network."

SK Telecom is also heavily involved in quantum-related research, with developments including QKD systems for the control and interworking of quantum cryptography communication networks. Japan is another important center of QKD research: a QKD network has existed in Tokyo since 2010, and in 2020 the financial services company Nomura Securities Co., Ltd. tested the transmission of data across it.

As the EU's project makes clear, satellite is an important part of the mix. Lord expects satellite-based QKD networks to come on stream in 2025 and 2026, enabling the purchase of wholesale quantum keys from a dedicated satellite quantum provider. Back in 2017, China used a satellite to make the first very long-distance transmission of data secured by QKD, between Beijing and Vienna, a distance of 7,000 km.

Securing The Edge

There are additional efforts to secure communications with edge devices. BT's Lord, for example, sees a role for digital fingerprints for IoT devices, phones, cars and smart meters in the form of a physical unclonable function (PUF) silicon chip, which, because of random imperfections in its manufacture, cannot be copied. In the UK, BT is trialing a combination of QKD and PUF to secure the end-to-end journey of a driverless car.
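What makes QKD tamper-evident is quantum measurement itself: in the classic BB84 protocol, key bits encoded in randomly chosen photon bases cannot be read by an eavesdropper without disturbing them. A toy, noiseless simulation of BB84's sifting step; real deployments like BT's add authentication, error correction and privacy amplification on top:

```python
# Toy BB84 sifting simulation: Alice sends bits in random bases, Bob measures
# in random bases, and both keep only positions where the bases matched.
# This models the noiseless, attacker-free case; an eavesdropper would
# introduce detectable errors during key verification.

import random

N = 32
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("+x") for _ in range(N)]  # rectilinear/diagonal
bob_bases   = [random.choice("+x") for _ in range(N)]

# Measuring in the wrong basis yields a uniformly random bit.
bob_results = [
    bit if a == b else random.randint(0, 1)
    for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
]

# Public discussion: compare bases (never the bits) and keep matches.
sifted_key = [bit for bit, a, b in zip(bob_results, alice_bases, bob_bases) if a == b]
print(f"raw bits: {N}, sifted key length: {len(sifted_key)}")

# Sanity check: on matching bases, Alice and Bob agree exactly.
assert sifted_key == [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]
```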
The connection to the roadside depends on standard radio with PUF authentication, while transmission from the roadside unit onward, as well as the overall control of the autonomous vehicle network, incorporates QKD, explains Lord. SK Telecom has developed what it describes as a quantum-enhanced cryptographic chip with Korea Computer & Systems (KCS) and ID Quantique. Telefónica Spain has partnered on the development of a quantum-safe 5G SIM card and has integrated quantum technology into the cloud service hosted in its virtual data centers. Given China's heavy investment in quantum technologies, it is no surprise to see its telecom operators involved in the field. China Telecom, for example, recently invested three billion yuan ($434m) in quantum technology deployment, according to Reuters.

Quantum in The Cloud

Some of America's biggest technology companies are investing in quantum computing. Today, it is even possible to access quantum computing facilities via the cloud, albeit at a small scale. IBM's cloud access to quantum computers is free at the most basic level, rising to $1.60 per second at the next level. And this is just the beginning: America's big tech companies are racing to build quantum computers at scale.

One measure of scale is the size of a quantum processor, which is measured in qubits. While a traditional computer stores information as a 0 or a 1, a qubit can represent both 0 and 1 simultaneously (see the statevector sketch below). This property enables a quantum computer to explore multiple potential solutions to a problem at once, and the greater the stability of its qubits, the more efficient it becomes.

IBM has a long history in quantum research and development. In 1998, it unveiled what was then a ground-breaking 2-qubit computer. By 2022, it had produced a 433-qubit processor, and in 2023 it aims to produce a 1,121-qubit processor. Separately, this month it announced the construction of its first quantum data center in Europe, which it expects to begin offering commercial services next year.

Google is also firmly in the race to build a large-scale quantum computer. In 2019, a paper in Nature featured Google's Sycamore processor and the speed with which it undertakes computational tasks. More recent work includes an experimental demonstration that errors can be reduced by increasing the number of qubits.

Microsoft reckons that "a quantum machine capable of solving many of the hardest problems facing humanity will ultimately require at least 1 million stable qubits that can perform 1 quintillion operations while making at most a single error." To this end, it is working on what it calls a new type of qubit, a topological qubit. Amazon, for its part, announced in 2021 an AWS Center for Quantum Computing on the Caltech campus to build a fault-tolerant quantum computer.
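The "both 0 and 1 simultaneously" description has a precise form: a qubit's state is a two-component complex vector, gates are unitary matrices acting on it, and measurement yields each outcome with probability equal to the squared amplitude. A minimal statevector sketch using standard linear algebra, not any vendor's quantum SDK:

```python
# Minimal statevector illustration of superposition: a Hadamard gate puts a
# qubit into an equal superposition of |0> and |1>, and measurement collapses
# it to one outcome with probability |amplitude|^2.

import numpy as np

ket0 = np.array([1, 0], dtype=complex)                        # the |0> state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

state = H @ ket0                    # (|0> + |1>) / sqrt(2)
probs = np.abs(state) ** 2          # outcome probabilities [0.5, 0.5]
probs = probs / probs.sum()         # guard against floating-point drift

rng = np.random.default_rng()
samples = rng.choice([0, 1], size=1000, p=probs)  # simulated measurements
print("P(0), P(1) =", probs)
print("measured frequencies:", np.bincount(samples, minlength=2) / 1000)
```

Simulating n qubits this way needs a vector of 2**n complex amplitudes, which is one intuition for why classical machines cannot keep up as processors grow toward the 433- and 1,121-qubit scale described above.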

Read More
Software, Future Tech, Application Development Platform

Over the Waterfall to GitOps

Article | August 16, 2023

One of the first steps on the journey to cloud-native is transforming culture. This starts with embracing Agile methodology, followed by implementation of DevOps processes and eventually GitOps, as we explore in this extract from the recent e-book Mind the gap: bridging the skills divide on the journey to cloud native.

Most CSPs agree that culture, including governance and skills, is the single biggest obstacle to adopting a cloud-native architecture. Traditional waterfall project management focuses on a linear progression, where one task or process needs to be completed before the next can start. This approach is time-consuming and costly, and it stifles innovation. It is a major reason a CSP typically takes more than a year to develop a new service. Adopting Agile methodology is a completely new way of working that focuses on building cross-functional teams to speed innovation and service creation. This requires CSPs to seek individuals who have new project management skills and are adaptable and quick-thinking. Agile may not be suitable for every aspect of the business or for every project, but it is critical for moving to cloud-based, and eventually cloud-native, environments.

Agile's assumptions:
• Early, continuous delivery of software leads to happy customers
• Changing requirements are always welcome, even in late development
• Working software is delivered frequently
• Business teams and developers work together every day
• Projects are built around motivated and trusted individuals
• Face-to-face is the best way to communicate
• Working software is the principal measure of progress
• Development is sustainable and constant
• Attention to technical excellence and good design are required
• Simplicity is essential
• The best architectures emerge from self-organizing teams
• Teams look for ways to be more effective and adjust accordingly

There are lots of Agile approaches, but many CSPs use a model made popular by Spotify, which organizes teams into 'squads,' 'tribes,' 'chapters,' and 'guilds.' Vodafone Group follows this model and uses 'very, very flat, non-hierarchical governance,' according to Dr. Lester Thomas, Chief IT Systems Architect at Vodafone Group. "We've learned doing this in the digital space, but we're trying to adopt that software approach right into our network."

Culture Eats Technology

UScellular began adopting Agile methodology about five years ago, and the company is implementing cloud-native applications wherever they make sense. During its shift to the new way of working, cultural change has been the most difficult obstacle to overcome, significantly harder than technological change, according to Kevin Lowell, the company's Chief People Officer and former Executive VP in charge of IT. The shift started with creating 'a compelling why', in this case improving how customers experience using UScellular services. The company replaced some waterfall processes with iterative Agile processes managed in scrums and implemented in sprints. The IT team also began meeting regularly with business stakeholders and educating them about how Agile works.

Telecom Argentina is also embracing Agile. It is working with Red Hat to adopt a framework called Team Topologies to create a more efficient way of collaborating. The company is applying Team Topologies within its network division to create cross-functional teams that focus not only on the evolution and operation of technological platforms but also on creating and delivering services.
From Agile to DevOps

While Agile methodologies help to establish communication between IT teams and other stakeholders in the company, DevOps goes further by introducing an end-to-end software lifecycle that establishes a continuous flow of development, integration, testing, delivery and deployment. Google's approach to DevOps, called Site Reliability Engineering (SRE), has been widely adopted in telecoms. It provides the foundation for the ODA Canvas, and it is how Vodafone Group is implementing DevOps.

Vodafone is a cloud-native pioneer. For the past several years, the company has been transforming into a platform provider, using what it calls a 'telco-as-a-service' or TaaS strategy. Vodafone is becoming a software company on its quest to become a techco, which involves hiring 7,000 software engineers to add to the 9,000 already in the company. A key driver for embracing a cloud-native approach is "moving from our millions of human customers to billions of things," says Thomas. Instead of offering just four primary services (fixed voice, broadband internet, mobility and TV), he envisions using 5G network slicing to support thousands of IoT services per vertical market. "Unless we can drive this through software-driven approaches and automation, we're not going to be successful," he says.

From DevOps to GitOps

The problem with DevOps, however, is that most CSPs aren't developing their own software; they buy solutions from vendor partners. As James Crawshaw, Principal Analyst for Telco IT & Operations at Omdia, notes in a research report, this makes it difficult for operators to create CI/CD pipelines that cut across organizational boundaries between CSPs and suppliers. To address this, CSPs "have adapted DevOps to their needs and created GitOps, which they use to take third-party applications and deploy [them] on their own platforms," Crawshaw explains.

Philippe Ensarguet, Group CTO at Orange Business Services, recently explained that GitOps requires continuous integration and continuous operations, or CI/CO. This means moving away from a prescriptive way of implementing operations to a declarative approach that supports full automation, as the simplified sketch at the end of this extract illustrates. "If you rely mainly on the prescriptive approach, the day you want to move into production and scale up the number of applications you implement, you have to manage it purely with humans, and you hit the wall on scalability," says Ensarguet.

William Caban, Telco Chief Architect at Red Hat, sees GitOps as foundational to the concept of zero-touch, zero-wait and zero-trouble services, which will be orchestrated end-to-end in autonomous networks. "This is exactly what GitOps is about: event-driven, intent-based networks," he says. "It becomes the operational model for architectures based on the ODA and autonomous networks."

CSPs must hire people with software and automation skills for GitOps. They must also reskill network experts, such as radio access network (RAN) engineers, to work in CI/CO teams so that everyone uses common terminology. Some operators are going further by creating centers of excellence (CoEs) where cross-functional teams from business, network and operations collaborate. "In GitOps, it is also necessary to codify team members' knowledge, so that even as people move around or leave the company, the software development and operations lifecycle processes are not disrupted," Caban says.
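The prescriptive-versus-declarative distinction Ensarguet draws is the heart of GitOps: operators describe the desired state in a Git repository, and an agent continuously reconciles the live system toward it. A simplified sketch of that control loop; the fetch and apply functions are illustrative stand-ins, not any real agent's API:

```python
# Sketch of a GitOps reconcile loop: the desired state lives in Git, and an
# agent repeatedly diffs it against the live system and converges the two.
# Production agents such as Argo CD or Flux do this against the Kubernetes
# API; the functions below are hypothetical stand-ins.

def fetch_desired_state_from_git() -> dict:
    # Stand-in for 'git pull' plus manifest parsing.
    return {"web": {"replicas": 3, "image": "shop:v2"},
            "api": {"replicas": 2, "image": "api:v7"}}

def fetch_live_state() -> dict:
    # Stand-in for querying the platform for what is actually running.
    return {"web": {"replicas": 2, "image": "shop:v1"},
            "api": {"replicas": 2, "image": "api:v7"}}

def apply_spec(name: str, spec: dict) -> None:
    # Stand-in for the API calls that converge the live system.
    print(f"reconciling {name}: converging to {spec}")

def reconcile_once() -> None:
    desired, live = fetch_desired_state_from_git(), fetch_live_state()
    for name, spec in desired.items():
        if live.get(name) != spec:   # declarative: no step-by-step runbook
            apply_spec(name, spec)

if __name__ == "__main__":
    reconcile_once()  # a real agent runs this on a timer or on Git webhooks
```

Because the loop compares states rather than executing a human-written sequence of steps, scaling to more applications means adding manifests to Git, not adding operators, which is exactly the scalability wall Ensarguet describes avoiding.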

Read More
Software, Low-Code App Development, Application Development Platform

Empowering Industry 4.0 with Artificial Intelligence

Article | June 15, 2023

The next step in industrial technology is about robotics, computers and equipment becoming connected to the Internet of Things (IoT) and enhanced by machine learning algorithms. Industry 4.0 has the potential to be a powerful driver of economic growth, predicted to add between $500 billion and $1.5 trillion in value to the global economy between 2018 and 2022, according to a report by Capgemini.

Read More

How Artificial Intelligence Is Transforming Businesses

Article | February 12, 2020

While many people still associate AI with sci-fi novels and films, its reputation as the antagonist of fictional dystopian worlds is becoming a thing of the past as the technology is integrated ever more deeply into our everyday lives. AI is now present not only in the home, through devices such as Amazon's Alexa, but throughout businesses everywhere, disrupting a variety of industries, often with tremendous results. The technology has helped to streamline even the most mundane tasks while having a breathtaking impact on companies' efficiency and productivity.

Read More

Related News

General AI

VMware Extends Updates for Tanzu Platform and Spring Framework

VMware | November 08, 2023

VMware has updated its Spring framework and Tanzu platform with AI and machine learning capabilities. Enhancements include DORA metrics, Spring Boot 3.2, Spring Framework 6.1, and new machine learning and AI capabilities in Tanzu Data Services. These updates aim to streamline application development, reduce costs, and enhance security for modern operating models in the generative AI economy.

On November 7, 2023, VMware introduced the latest updates to its Spring framework and Tanzu platform, aiming to empower software development teams to build higher-performing applications that leverage cutting-edge technology like AI and machine learning more efficiently and securely. The enhancements are designed to streamline application development, reduce costs, and improve security while accommodating modern operating models such as cloud containers and serverless environments.

Purnima Padmanabhan, Senior Vice President and General Manager of the Modern Apps and Management Business Group at VMware, remarked that the velocity of innovation is what differentiates companies from the competition, and that the value of next-generation apps will be elevated through new capabilities like ML and AI and scalability across any cloud. She added that the company has focused for decades on providing developers with exceptional tools and experiences; as it marks the 20th anniversary of Spring, the latest enhancements and deep integration with the Tanzu platform give application teams the ability to leverage more cutting-edge technologies like AI in new apps and take those apps to production safely, quickly, and more securely.

The Tanzu platform now incorporates DORA metrics to track software delivery performance and offers greater transparency in the developer portal. The integration of VMware Tanzu Spring Runtime into Tanzu Application Platform enhances the Java application development experience, and Tanzu Application Service 5.0 offers new features such as a Postgres tile for DBaaS and AI support. Spring-related updates include Spring Boot 3.2 and Spring Framework 6.1, which enable GraalVM native images for better app runtime scalability, greater energy efficiency, and lower RAM consumption. Spring AI simplifies AI application development using the familiar Spring Framework, and Tanzu Spring Health Assessment helps organizations identify security issues in their Spring application portfolios.

Tanzu Data Services enhancements add machine learning and AI capabilities to data services. Tanzu Intelligence Services now include VMware Tanzu CloudHealth for achieving cloud sustainability goals, VMware Tanzu Guardrails for continuous compliance, and VMware Tanzu Application Catalog for open-source content security. The Tanzu platform will continue to integrate into a common control plane, VMware Tanzu Hub, offering a refreshed user experience, integrated observability, migration planning and assessments, cost reports, and enhanced Intelligent Assist capabilities. Together, these updates aim to help organizations develop, operate, and optimize modern applications more effectively in the generative AI economy.

Read More

AI Tech

Silobreaker Releases AI for Swift and Precise Threat Intelligence

Silobreaker | November 07, 2023

Silobreaker, a leading security and threat intelligence technology company, has announced the launch of its new AI tool, Silobreaker AI. The tool is designed to assist threat intelligence teams in collecting, analyzing, and reporting on intelligence requirements, enabling faster, intelligence-led decision-making within organizations.

The tool, which will be integrated into the Silobreaker intelligence platform, can summarize and extract key information from Silobreaker's own analyst content and generate topical threat reports, which stakeholders can then use to make informed decisions. Per Lindh, CTO of Silobreaker, described the tool as a 'cheat code' for threat intelligence teams, providing faster insights into threats and enabling decisive action to reduce risks. It also augments Silobreaker's range of threat intelligence capabilities, adding computer-aided learning and automation techniques to streamline the collection, analysis, and dissemination of open-source intelligence data.

While the introduction of Silobreaker AI promises to transform threat intelligence, it is important to consider potential drawbacks. Reliance on AI could lead to overdependence, reducing human oversight and possibly missing nuanced threats that require human judgement, and the tool's effectiveness depends on the quality of the data it is trained on, which could limit its accuracy if not properly managed. On the other hand, the benefits are substantial: Silobreaker AI can accelerate the production of high-quality intelligence reports, enabling faster, more informed decision-making and significantly improving the efficiency and productivity of threat intelligence teams.

About Silobreaker

Silobreaker is a software-as-a-service (SaaS) platform that specializes in threat intelligence. It streamlines the intelligence cycle, from managing cyber, physical, and geopolitical PIRs to collecting, processing, analyzing, and disseminating structured and unstructured data from various web sources. The platform aids intelligence teams in identifying and prioritizing threats, enabling decision-makers to mitigate risk, protect revenue, and drive business results swiftly. Silobreaker caters to a diverse clientele, including corporate, government, military, and financial services sectors, addressing use cases across cyber and corporate security, competitive intelligence, incident management, market intelligence, risk analysis, asset management, and general OSINT.

Read More

Software

BMC Software Announces AIOps in BMC Helix for AI-Optimized IT Ops

BMC Software | November 06, 2023

BMC Software's new AIOps capabilities in BMC Helix Operations Management use AI for fast IT issue resolution. The solution boosts IT operations in hybrid, multi-cloud environments, enhancing visibility and service performance. New features such as service blueprints, causal AI-powered explainability, and AIOps situation fingerprinting expedite incident resolution and risk recovery.

BMC Software, a global leader in IT solutions, has announced new AIOps capabilities for its BMC Helix Operations Management solution using the BMC HelixGPT capability. The solution uses advanced AI to find the root causes of problems more quickly, and it changes the way IT works by adding dynamic service modeling, situation explainability, and deep container auto-detection to better understand containerized environments. As businesses grapple with complex hybrid, multi-cloud environments and increasing data volume and complexity, the need for advanced AI and machine learning to drive visibility, observability, and optimum business service performance is paramount.

Nancy Gohring, research director for IDC's Enterprise System Management, Observability and AIOps program, emphasized the importance of modernizing IT operations in line with the adoption of hybrid and cloud-native technologies. AIOps capabilities that leverage AI to pinpoint problem causes, guide users to the correct response, and predict potential future issues are key to ensuring service delivery aligns with business outcomes.

The BMC Helix Operations Management solution combines advanced causal AI to identify issue root causes, predictive AI for proactive problem identification and resolution, and generative AI for automating event summaries and best-action recommendations for complex problems. These innovations enable IT operations to deliver higher service availability and resilience, driving efficient operational performance with greater visibility across tool silos and superior AI-driven insights for significantly improved problem identification and repair times. The new capabilities include out-of-the-box service blueprints, situation explainability powered by causal AI, and AIOps situation fingerprinting powered by AI, GPT, and NLP. These features ensure accurate service models in ever-changing IT environments, swift incident resolution, and faster recovery from service outages and other potential risks.

While the new AIOps capabilities offer a host of benefits, they also present potential challenges. The complexity of AI systems can make their behavior difficult to understand and control, which could complicate troubleshooting; heavy reliance on AI might reduce the level of human oversight in IT operations, which could be risky in certain scenarios; and the solution's effectiveness depends on the quality and quantity of the data it receives, which might not always be optimal in real-world deployments.

On the brighter side, the benefits are substantial. Advanced AI capabilities allow for swift identification and resolution of IT issues, greatly improving operational efficiency, and the solution's ability to enhance IT operations in complex hybrid, multi-cloud environments provides much-needed visibility and service performance. These innovations lead to higher service availability and resilience, which are crucial for businesses in today's digital age. Overall, despite some potential challenges, the new AIOps capabilities in BMC Helix Operations Management represent a promising advancement in IT operations management.

Read More

