Top 10 IoT Application Development Trends to Watch in 2023

Explore the top IoT app development trends of 2023 and keep up with the latest technological advancements in the IoT development space to drive sustainable, long-term business growth.


Table of Contents

IoT Application Development for Business Growth
Comprehensive List of IoT Application Development Trends
Final Say on the Future of Internet of Things App Development

Keeping up with new developments in technology, including the latest IoT innovations, has become crucial, so businesses are looking to the top IoT application development trends to optimize their development workflows. Data is a key ingredient in creating a refined user experience, and well-built IoT sensors and IoT-enabled applications generate meaningful data. This data helps businesses streamline their operations and helps consumers improve their quality of life. Likewise, developing IoT applications is essential for effectively managing and controlling IoT devices and smart ecosystems.
 
Mobile IoT applications make it possible to perform a wide range of functions efficiently on the go. A good IoT development platform will not only reduce development time but also help get the product into the hands of customers earlier, reducing both direct and opportunity costs.
 

IoT Application Development for Business Growth

IoT is crucial for business growth because it enables real-time data collection and analysis, automation, constant connectivity, and insight into a company's systems. It streamlines processes across industries, boosting productivity and supporting expansion into new lines of business. It also reduces expenses, allowing companies to invest more in their core business. By integrating IoT into their operations, businesses can optimize processes, cut costs, and enhance customer experiences through smart devices that enable remote monitoring, predictive maintenance, and data-driven decision-making. IoT applications also facilitate personalized marketing, helping companies stay competitive, expand their market reach, and foster innovation for sustainable growth.
 

Comprehensive List of IoT Application Development Trends

Keeping up with app industry trends is important for growing IoT app development, finding opportunities to save costs by leveraging the latest innovations in the market, and delivering a superior user experience. Here is a list of the top IoT application development trends for 2023:
 

1.         Digital Twins

Digital twins are one of the top IoT application development trends of 2023, enabling seamless virtual interactions for organizations, services, and customers. A digital twin is a virtual replica of a physical asset or process that is used for analysis and enhancement, providing real-time insights and boosting productivity. Industries such as urban planning and automotive leverage digital twins to improve operations and simulate scenarios in risk-free environments. With IoT adoption soaring, digital twins integrate with IoT systems to consume real-time data, ushering in a tech revolution that bridges the divide between the digital and physical worlds. They make it possible to model entire virtual towns, communities, and infrastructure.
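
The core idea can be shown with a minimal sketch: a twin object that mirrors the last-known state of a physical device from incoming telemetry and lets analysis run against the replica instead of the asset itself. The class, field names, telemetry format, and maintenance threshold below are illustrative assumptions, not any specific platform's API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class PumpTwin:
    """Hypothetical digital twin mirroring one industrial pump."""
    device_id: str
    state: dict = field(default_factory=dict)   # last-known sensor readings
    last_seen: Optional[datetime] = None

    def apply_telemetry(self, message: dict) -> None:
        """Update the virtual replica from a real-time telemetry message."""
        self.state.update(message.get("readings", {}))
        self.last_seen = datetime.now(timezone.utc)

    def needs_maintenance(self) -> bool:
        """Run analysis against the replica rather than the physical asset."""
        return self.state.get("vibration_mm_s", 0.0) > 7.1  # illustrative threshold

# Usage: feed the twin whatever the IoT platform delivers (MQTT, webhook, etc.)
twin = PumpTwin(device_id="pump-017")
twin.apply_telemetry({"readings": {"vibration_mm_s": 8.3, "temperature_c": 61.2}})
print(twin.device_id, "maintenance needed:", twin.needs_maintenance())
```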
 

2.         AI and IoT

The integration of AI with IoT is a top trend for 2023. AI helps make sense of vast volumes of IoT data and boosts its value for businesses, and major cloud vendors such as Amazon, Microsoft, and Google will increasingly compete on AI capabilities. Low-code and no-code AutoML tools will gain ground for IoT analytics, along with a focus on security in manufacturing, big data analytics, and machine learning. Feeding IoT data to an AI model to analyze trends and predict future demand is one of the most promising directions shaping IoT app development.
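
As a minimal, hedged illustration of the "feed IoT data to a model to predict demand" idea, the sketch below fits a simple linear trend to hourly sensor readings and projects the next few hours. Production systems would use far richer models and their own data formats; the numbers and variable names here are made up, and NumPy is assumed to be available.

```python
import numpy as np

# Hourly energy-demand readings from an IoT meter (illustrative values, in kWh).
hours = np.arange(24)
demand = 40 + 0.8 * hours + np.random.default_rng(0).normal(0, 2, size=24)

# Fit a simple linear trend: demand ≈ slope * hour + intercept.
slope, intercept = np.polyfit(hours, demand, deg=1)

# Predict demand for the next six hours so capacity can be planned ahead of time.
future_hours = np.arange(24, 30)
forecast = slope * future_hours + intercept
for h, value in zip(future_hours, forecast):
    print(f"hour {h}: forecast {value:.1f} kWh")
```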
 

3.         Novel Sensor Technology

Novel sensor technology is a top IoT trend in 2023. Combining IoT with advanced sensors expands their impact and use cases, and Gartner predicts further advances in sensor technology and algorithms. One example is Henkel's sensor experience kit, tailored for IoT engineering across industries, which lets engineers explore printed electronics for IoT sensor solutions. This technology enables remote monitoring, better efficiency tracking, and more sustainable resource use, and it has the potential to disrupt the global supply chain by reducing lead times and improving inventory management.
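
To make the remote-monitoring and inventory use case concrete, here is a small sketch in which simulated fill-level readings from bin sensors are checked against a reorder point. The sensor values, threshold, and bin identifiers are illustrative assumptions rather than any vendor's API.

```python
# Simulated fill-level readings (percent full) from sensors on storage bins.
bin_levels = {
    "warehouse-A/bin-12": 72.0,
    "warehouse-A/bin-13": 18.5,
    "warehouse-B/bin-02": 9.0,
}

REORDER_POINT = 20.0  # illustrative threshold: reorder when a bin drops below 20% full

def bins_to_replenish(levels: dict, threshold: float) -> list:
    """Return the bins whose remote readings indicate stock should be reordered."""
    return [bin_id for bin_id, level in levels.items() if level < threshold]

for bin_id in bins_to_replenish(bin_levels, REORDER_POINT):
    # In a real system this would raise a purchase order or alert a planner.
    print(f"Reorder triggered for {bin_id}: level {bin_levels[bin_id]:.1f}% full")
```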
 

4.         5G Automotive Services

The combination of 5G and IoT will greatly impact the automotive sector, which represents more than half of the 5G IoT market, making 5G automotive services a major IoT development trend in 2023. According to Gartner's research, the share of connected cars will increase significantly, reaching 94% by 2028. Electric vehicles will benefit from improved connectivity, leading to better control and performance, and 5G will also support EV infrastructure such as charging stations and efficient route finding. However, some aspects of the automotive industry's transformation remain unclear, as customers now expect more than just transportation from their cars.
 

5.         Innovation at the Chip Level for IoT Devices

Specialized chips are emerging for IoT systems, driven by the adoption of edge computing and AI. For instance, MediaTek's Genio 700 IoT chipset targets applications in industrial, smart home, and smart retail settings and is set to be available by Q2 2023. Advances in chips, connectivity, security, and AI are reducing costs and improving devices, making innovation at the chip level for IoT devices a top trend to watch in 2023. Smaller and more efficient processors and wireless components let connected devices enter key markets such as consumer appliances, cars, manufacturing, and healthcare, while enhanced networks provide more reliable connectivity and create demand for interconnected devices.
 

6.         Healthcare and IoT

The Internet of Healthcare Things is a top IoT trend for 2023, as projected by Forbes. The market for IoT-enabled health devices is expected to reach $267 billion by 2023. Combining IoT with cutting-edge technologies in healthcare may lead to significant advancements, and post-COVID-19, cloud, IoT, and AI technologies have been used to explore new use cases. The global Internet of Medical Things (IoMT) market is expected to reach $187.60 billion by 2028, a substantial increase from its $41.17 billion value in 2020. More connectivity means better care for patients, creating a tremendous opportunity to build IoT-enabled healthcare apps that improve overall patient care and experience.
 
Some of the significant IoT-enabled apps in healthcare include:
 
  • Smart wearables, implants, and ingestible electronics that monitor patient health and safety.
  • Remote physiological monitoring that reduces hospital readmissions and improves care quality (a minimal example follows this list).
  • Hospital operations management that optimizes workflow, resource allocation, and patient satisfaction.
  • Glucose monitoring that helps diabetic patients control their blood sugar levels.
  • Connected inhalers that track medication usage and alert patients to potential triggers.
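
The following sketch gives a flavor of how a remote physiological monitoring app might flag a reading that needs attention. The vital-sign names, thresholds, and alert mechanism are illustrative assumptions, not medical guidance or a specific platform's API.

```python
from typing import Optional

# Illustrative, non-clinical alert thresholds for a remote-monitoring prototype.
THRESHOLDS = {
    "heart_rate_bpm": (50, 110),   # (low, high)
    "spo2_percent": (92, 100),
    "glucose_mg_dl": (70, 180),
}

def check_reading(metric: str, value: float) -> Optional[str]:
    """Return an alert message if a vital sign falls outside its expected range."""
    low, high = THRESHOLDS[metric]
    if value < low:
        return f"{metric} low: {value}"
    if value > high:
        return f"{metric} high: {value}"
    return None

# Example telemetry from a wearable or connected glucose monitor.
reading = {"patient_id": "p-042", "metric": "glucose_mg_dl", "value": 212.0}
alert = check_reading(reading["metric"], reading["value"])
if alert:
    # In a real app this would notify a care team rather than print to the console.
    print(f"ALERT for {reading['patient_id']}: {alert}")
```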
 

7.         IoT Security

IoT security is a crucial 2023 trend due to the rapid growth of Internet of Things applications across sectors. As IoT devices become more prevalent, ransomware attacks pose a significant risk and can lead to serious disruption. To address this, businesses need robust security systems that safeguard consumer IoT devices and prevent cyberattacks.
 
To prevent ransomware attacks on IoT devices, take these steps:

  • Update and patch systems to address vulnerabilities.
  • Employ secure authentication strategies.
  • Enforce the principle of least privilege.
  • Regularly back up files.
  • Ensure strong network protection.
  • Monitor network traffic.
  • Prioritize security over connectivity.
  • Disable unneeded services or ports on devices.
  • Keep all IoT devices, software, and hardware updated.
 
By following these steps, businesses can protect IoT devices against ransomware attacks.
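
As one concrete, hedged example of the "secure authentication" and "strong network protection" steps above, the sketch below opens a TLS-encrypted MQTT connection with client certificates using the widely used paho-mqtt library. The broker address, certificate paths, credentials, and topic are placeholders to replace with your own, and the constructor call assumes paho-mqtt 1.x (version 2.x additionally expects a callback API version argument).

```python
import paho.mqtt.client as mqtt

# Placeholder connection details: substitute your own broker, credentials, and topic.
BROKER_HOST = "broker.example.com"
BROKER_PORT = 8883  # MQTT over TLS

# paho-mqtt 1.x constructor; with paho-mqtt 2.x, pass
# mqtt.CallbackAPIVersion.VERSION2 as the first argument.
client = mqtt.Client(client_id="sensor-gateway-01")

# Mutual TLS: the broker verifies the device certificate and vice versa.
client.tls_set(
    ca_certs="ca.crt",       # trusted certificate authority
    certfile="device.crt",   # this device's certificate
    keyfile="device.key",    # this device's private key
)

# Per-device credentials as an additional factor, if the broker requires them.
client.username_pw_set(username="gateway-01", password="use-a-strong-unique-secret")

client.connect(BROKER_HOST, BROKER_PORT)
# Publish only to a narrowly scoped topic; broker-side ACLs enforce least privilege.
client.publish("plant1/line3/temperature", payload="21.7", qos=1)
client.disconnect()
```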
 

8.         New Interfaces and Experiences for Users

One of the prominent Internet of Things trends in 2023 is the rise of new interfaces and experiences for users. Innovative interfaces offer improved user experiences but also present fresh challenges. Notably, virtual assistants have gained popularity, with various digital assistants being integrated into homes. Assistants such as Amazon Alexa are excellent examples of how natural language processing (NLP) plays a crucial role in enhancing user-device interactions. These advancements bring both opportunities and challenges, with a focus on security, privacy, and standardization. One such IoT standard, Matter (formerly known as Project CHIP), provides a seamlessly connected user experience. The future promises an interconnected world where smart devices seamlessly cater to consumer needs.
 

9.         5G for 24x7 Connectivity

The use of 5G for 24x7 connectivity is a significant IoT trend in 2023. 5G enables transformative IoT applications with numerous innovative use cases: it offers low latency, supports roughly ten times more devices per square kilometer than 4G, and delivers speeds around ten times faster. This facilitates remote patient treatment and health monitoring, media-rich presentations from home, autonomous driving, smart grids that help prevent blackouts, and connected sensors for parking and safety. Enterprises also benefit from network slicing for private networks, which improves data management, integrity, and access. The rollout of 5G networks drives productivity and efficiency improvements across industries.
 

10.      The Power of Edge Computing

IoT edge computing processes data at the edge of the network or near devices, solving latency issues, improving network bandwidth, and enabling faster responses and local decision-making. It enhances safety, operational efficiency, and the user experience. The power of edge computing has quickly become one of the top IoT application development trends in 2023. IoT edge computing can function with a cloud or a centralized data center and even offline when a network connection is lost. Edge computing allows IoT devices to generate, store, and analyze data at the source, enabling autonomous operation and direct communication between devices without centralized processing. The main advantage is improved IoT performance, as edge computing maintains scalability and flexibility while significantly reducing latency compared to traditional cloud environments.
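
A minimal sketch of the offline-capable, decide-at-the-source behavior described above: readings are evaluated locally, latency-critical actions happen immediately, and results are buffered whenever the cloud is unreachable. The function names, random connectivity check, and in-memory buffer are illustrative simplifications, not a particular edge framework.

```python
import queue
import random
import time

cloud_buffer = queue.Queue()  # holds readings until the uplink returns

def read_sensor() -> dict:
    """Stand-in for a local sensor read on the edge device."""
    return {"ts": time.time(), "temperature_c": round(random.uniform(20, 90), 1)}

def act_locally(reading: dict) -> None:
    """Local decision-making: react immediately without a round trip to the cloud."""
    if reading["temperature_c"] > 80:
        print("Edge action: throttling machine, temperature", reading["temperature_c"])

def cloud_available() -> bool:
    """Placeholder connectivity check; a real device would probe its uplink."""
    return random.random() > 0.5

def sync_to_cloud() -> None:
    """Flush buffered readings once connectivity is restored."""
    while not cloud_buffer.empty():
        print("Uploading buffered reading:", cloud_buffer.get())

for _ in range(5):
    reading = read_sensor()
    act_locally(reading)       # latency-critical logic stays on the device
    cloud_buffer.put(reading)  # everything else is queued for later upload
    if cloud_available():
        sync_to_cloud()
    time.sleep(0.1)
```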
 

Final Say on the Future of Internet of Things App Development

In conclusion, the future of IoT app development looks bright, with many exciting trends to watch in 2023. Digital twins will allow businesses to bridge the gap between the real and virtual worlds, while AI and IoT will help businesses extract more value out of their ever-growing volumes of data. Novel sensor technology will continue to impact all components of the IoT model, while 5G automotive services will open the door for more devices and data traffic.
 
Innovation at the chip level for IoT devices will continue to drive progress, while healthcare and IoT will see increased adoption. Security remains a top concern for IoT devices and applications, and new interfaces and experiences for users, such as voice user interfaces (VUI), will become mainstream. Edge computing will make it easier for businesses to process data faster and closer to the point of action, while blockchain in the IoT will see broader adoption. With all these trends and more, there is no shortage of innovation in the field of IoT app development. Developer communities keen on keeping up with the latest trends rely on attending the top app development conferences and pursuing app development certifications to stay ahead.
