Top Trends That Will Impact iOS Application Development in 2019

The latest version of iOS, 12.1.4, was released last week, fixing a security flaw in Group FaceTime. Well, we are not going to discuss that here! What is noteworthy is that Apple, the Cupertino company, is committed to providing seamless performance to iOS device users. Since releasing the first version, then called iPhone OS, in 2007, Apple has kept iPhone and iPad users front of mind.

Spotlight

Harmony Healthcare IT, the Makers of Health Data Archiver

Technology drives the accessibility, security, efficiency and quality of your clinical data. That means that it has to be right, 24/7. No exceptions. No excuses. Each element of your information framework must work with the next to deliver system reliability and data integrity. At Harmony Healthcare IT, we specialize solely in healthcare technology.

OTHER ARTICLES
Software, Low-Code App Development, Application Development Platform

Are Telcos Ready for a Quantum Leap?

Article | July 3, 2023

Quantum technologies present both an opportunity for telcos to solve difficult problems and provide new services, and a security threat that could require extensive IT investment.

When Andrew Lord, Senior Manager, Optical Networks and Quantum Research at BT, first started presenting quantum technologies at customer events six or seven years ago, his was the graveyard shift, he says, entertaining attendees at the end of the day with talk of 'crazy quantum stuff.' "But that is no longer the case," says Lord. "Over the last two years, I've noticed a shift where I now speak before lunch, and customers actively seek us out."

Two developments may be causing the shift: customers' growing awareness of the threats and opportunities that quantum computing presents, plus a recent spike in investment in quantum technology. In 2022, investors plowed $2.35 billion into quantum technology startups, which include companies in quantum computing, communications and sensing, according to McKinsey. The public sector has also been digging deep into its pockets. Last year, the United States added $1.8 billion to its previous spending on quantum technology, and the EU committed an extra $1.2 billion, the consultancy noted, while China made total investments of $15.3 billion.

According to Luke Ibbetson, Head of Group R&D at Vodafone, quantum computing's promise lies in solving, within a few hours, probabilistic problems that would take a classical computer a million years to accomplish. This breakthrough would enable telcos to address optimization problems related to network planning and base station placement. The flip side is that a powerful quantum computer could also break the public-key cryptography that protects today's IT systems from hackers. As a spokesperson at Deutsche Telekom remarks: "Telcos will have to react to the threat of quantum computers to communication security because their core business model is at risk, which is offering secure digital communications."

The idea of quantum computing posing a security threat is not new. In 1994, Peter Shor, a mathematician then working at AT&T Bell Labs, showed how a quantum computer could efficiently solve the discrete logarithm and integer factorization problems that underpin public-key encryption. "His work simultaneously ignited multiple new lines of research in quantum computing, information science, and cryptography," according to an article by the Massachusetts Institute of Technology, where Shor now works.

Beyond The Lab

What has changed nearly thirty years on is that quantum computing is creeping out of the lab. Sizeable obstacles to large-scale quantum computing remain, however. Quantum computers are highly sensitive to interference from noise, temperature, movement or electromagnetic fields and are therefore very difficult and expensive to build and operate, especially at scale: IBM's latest quantum processor, for example, operates at approximately 0.02 kelvin. When Deutsche Telekom's T-Labs tested telco use cases, it found quantum computing coped well with small problem statements. "However, when the problem size was scaled to real-world problem sizes, the quality of the QComp solution degraded," according to the spokesperson. The company is now awaiting the next generation of quantum computing platforms to redo the analyses.

All of this means that, for now, quantum computers are not large and powerful enough to run Shor's algorithm against real-world key sizes. The question is, when will someone succeed?
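To make the threat concrete, here is a minimal, purely classical Python sketch of the number-theoretic reduction behind Shor's work: if you can find the order r of a random base a modulo N, you can usually recover N's prime factors from gcd(a^(r/2) ± 1, N). The brute-force order finder below stands in for the step a quantum computer performs exponentially faster; the toy modulus and function names are chosen for illustration, and this is not Shor's quantum algorithm itself.

```python
import math
import random

def find_order(a: int, n: int) -> int:
    """Brute-force the multiplicative order of a mod n (the step a
    quantum computer accelerates). Only feasible for tiny n."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_via_order(n: int, attempts: int = 20):
    """Classical reduction Shor exploited: order finding -> factoring."""
    for _ in range(attempts):
        a = random.randrange(2, n - 1)
        g = math.gcd(a, n)
        if g > 1:                       # lucky: a already shares a factor
            return g, n // g
        r = find_order(a, n)
        if r % 2:                       # need an even order
            continue
        y = pow(a, r // 2, n)
        if y == n - 1:                  # trivial square root, try again
            continue
        p = math.gcd(y - 1, n)
        if 1 < p < n:
            return p, n // p
    return None

if __name__ == "__main__":
    # Toy "RSA modulus": trivial here, infeasible classically at 2048 bits.
    print(factor_via_order(3233))       # 3233 = 61 * 53
```

Against a 2048-bit modulus the order-finding step is hopeless for any classical machine, which is exactly the gap a sufficiently large quantum computer would close.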
The Global Risk Institute tracks the quantum threat timeline. In its latest annual report, the organization asked 40 quantum experts whether they thought it likely that within the next ten years, a quantum computer would break an encryption scheme like RSA-2048 in under 24 hours. Over half the respondents judged the event to be more than 5% likely, and almost a quarter considered it to be more than 50% likely. Any breakthrough will come from a relatively small number of actors. Today, governments and academic institutions are home to around half of the 163 projects tracked worldwide by Global Quantum Intelligence, a research and analysis company, according to its CEO, André M. König, with big technology companies and specialized startups accounting for the rest.

Q2K

Nonetheless, the impact of quantum computing could be widespread, even if relatively few machines are ever built. The challenge of preparing for a post-quantum future is often called Q2K, in reference to the Y2K bug. In the late 1990s, many (but not all) governmental organizations and companies spent millions of dollars on Y2K systems integration to ensure that IT programs written from the 1960s through the 1980s would be able to recognize dates after December 31, 1999, all while being uncertain of the scale or the impact of the risk if they didn't. Q2K differs in that there is no specific deadline, and the dangers of a major security breach are much clearer cut. However, it is similar in demanding a lot of work on aging systems.

"Cryptography is used everywhere," points out Lory Thorpe, IBM's Director of Global Solutions and Offerings, Telecommunications. She adds, "Because telco systems have been built over periods of decades, people don't actually know where cryptography is being used. So, if you start to look at the impact of public key cryptography and digital signatures being compromised, you start to look at how those two things impact open source, how that impacts the core network, the radio network, [and] OSS/BSS, network management, how the network management speaks to the network functions and so on."

This complexity is why some analysts recommend that telcos take action now. "You're going to find tens of thousands of vulnerabilities that are critical and vulnerable to a quantum attack. So, do you have to worry about it today? Absolutely - even if it's in 2035," says König. "Anyone who has ever done [IT implementation projects], and anyone who's ever worked in cybersecurity [knows], tens of thousands of vulnerabilities that are critical [requires] years and years and years of just traditional integration work. So, even if you're skeptical about quantum, if you haven't started today, it is almost too late already."
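As a first, very rough step toward the cryptographic inventory Thorpe describes, the sketch below greps a code or configuration tree for tell-tale algorithm names. A real audit would also cover binaries, certificates, protocols and vendor systems; the patterns and file filters here are illustrative assumptions, not an auditing product.

```python
"""Rough 'Q2K' inventory sketch: flag files that mention quantum-vulnerable
public-key algorithms so they can be reviewed for a PQC migration plan."""
import re
import sys
from collections import Counter
from pathlib import Path

# Indicators of quantum-vulnerable public-key crypto (illustrative list).
VULNERABLE = {
    "RSA": re.compile(r"\bRSA[-_ ]?(1024|2048|4096)?\b"),
    "ECDSA/ECDH": re.compile(r"\bECD(SA|H)\b|\bsecp256|\bprime256v1\b"),
    "Diffie-Hellman": re.compile(r"\bDiffie[- ]?Hellman\b|\bDHE\b"),
}

def scan(root: str) -> None:
    hits = Counter()
    for path in Path(root).rglob("*"):
        # Skip directories and obviously binary assets.
        if not path.is_file() or path.suffix in {".png", ".jpg", ".zip"}:
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for name, pattern in VULNERABLE.items():
            if pattern.search(text):
                hits[name] += 1
                print(f"{name:<16} {path}")
    print("\nSummary:", dict(hits))

if __name__ == "__main__":
    scan(sys.argv[1] if len(sys.argv) > 1 else ".")
```

Even a crude pass like this tends to surface far more hits than teams expect, which is König's point about the years of integration work ahead.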
Don't Panic!

For the past two to three years, Vodafone has been preparing to migrate some of its cryptographic systems to be quantum-safe, according to Ibbetson. He believes there is no need to panic, but telcos must start planning now. König said: "The telecoms industry as a whole is not moving as quickly as some other sectors, notably the banking, pharmaceutical, and automotive industries. In these sectors, post-quantum security planning often involves CEOs at a very strategic level." For this reason, Vodafone joined forces with IBM in September 2022 to establish the GSMA Post-Quantum Telco Network Taskforce.

"Even though many industries are preparing to be able to defend against future quantum threats, we didn't see anything happening particularly in the telco space, and we wanted to make sure that it was a focus," says Ibbetson. "Obviously it will turn into an IT-style transformation, but it's starting now with understanding what it is we need to mobilize that." AT&T has also been working to pinpoint what needs to be addressed. Last year, the company said it aims to be quantum-ready by 2025, in the sense that it will have done its due diligence and identified a clear path forward.

Minding Your PQCs

Companies across multiple sectors are looking to post-quantum cryptography (PQC) to secure their systems, which relies on new algorithms that are much harder to crack than RSA, even for a quantum computer. König contends that PQC needs to become "a standard component of companies' agile defense posture" and believes the development of PQC systems by software and hardware companies will help keep upgrade costs under control. "From a financial point of view, vendors do a fantastic job bringing this to market and making it very accessible," says König. Lord, who has been researching quantum technologies at BT for over a decade, is also confident that there is "going to be much more available technology." As a result, even smaller telcos will be able to invest in securing their systems. "It doesn't need a big boy with lots of money [for] research to do something around PQC. There's a lot of work going on to ratify the best of those solutions," says Lord.

There are several reasons why eyes are on software-based PQC. Firstly, it can be used to re-secure data that was encrypted in the past and that advances in quantum computing will make vulnerable in the future. In addition, the quantum-based alternative to PQC for securing network traffic, quantum key distribution (QKD), comes with a huge drawback for wireless operators. QKD is hardware-based and uses quantum mechanics to prevent interception across optical fiber and satellite (i.e., free space optical) networks, making it secure, albeit expensive. But for reasons of physics, it does not work on mobile networks.

Setting Standards

Given the importance of PQC, a lot of effort is going into standardizing robust algorithms. The political weight of the US and the size of its technology industry mean that the US government's National Institute of Standards and Technology (NIST) is playing a key role in technically evaluating candidate post-quantum algorithms and creating standards. NIST expects to publish the first set of post-quantum cryptography standards in 2024. In the meantime, Dustin Moody, a NIST mathematician, recommends (in answers emailed to Inform) that companies "become familiar and do some testing with the algorithms being standardized, and how they will fit in your products and applications. Ensure that you are using current best-practice cryptographic algorithms and security strengths in your existing applications. Have somebody designated to be leading the effort to transition."
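One practical reading of König's "agile defense posture" is crypto agility: keeping the choice of key-establishment algorithm behind an interface so that a classical scheme can later be swapped for, or combined with, a NIST-standardized PQC algorithm without touching application code. The sketch below is illustrative only; the interface, class names and the insecure toy placeholder are assumptions, not a real PQC implementation or any vendor's API.

```python
"""Crypto-agility sketch: application code depends on an abstract KEM,
so the concrete algorithm (classical today, post-quantum tomorrow) can
be swapped by configuration. Names and the demo KEM are illustrative."""
import hashlib
import os
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Encapsulation:
    ciphertext: bytes
    shared_secret: bytes

class KEM(ABC):
    @abstractmethod
    def generate_keypair(self) -> tuple:
        ...
    @abstractmethod
    def encapsulate(self, public_key: bytes) -> Encapsulation:
        ...
    @abstractmethod
    def decapsulate(self, secret_key: bytes, ciphertext: bytes) -> bytes:
        ...

class ToyKEM(KEM):
    """Placeholder with NO security; stands in for a future PQC binding."""
    def generate_keypair(self):
        sk = os.urandom(32)
        return hashlib.sha256(sk).digest(), sk            # (public, secret)
    def encapsulate(self, public_key):
        nonce = os.urandom(32)
        ss = hashlib.sha256(public_key + nonce).digest()
        return Encapsulation(ciphertext=nonce, shared_secret=ss)
    def decapsulate(self, secret_key, ciphertext):
        pub = hashlib.sha256(secret_key).digest()
        return hashlib.sha256(pub + ciphertext).digest()

REGISTRY = {"toy": ToyKEM}   # later e.g. {"pqc-kem": SomeStandardizedKEM}

def establish_secret(algorithm: str) -> bytes:
    kem = REGISTRY[algorithm]()          # chosen by configuration, not code
    pub, sec = kem.generate_keypair()
    enc = kem.encapsulate(pub)
    assert kem.decapsulate(sec, enc.ciphertext) == enc.shared_secret
    return enc.shared_secret

print(establish_secret("toy").hex())
```

The design choice being illustrated is simply that the algorithm name lives in a registry driven by configuration, which is what makes a later PQC migration an operational change rather than a rewrite.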
QKD

There is no absolute guarantee, however, that a quantum computer in the future won't find a way to crack PQC. Therefore, institutions such as government agencies and banks remain interested in using QKD fiber and satellite networks to ensure the highest levels of security for data transmission.

The European Commission, for example, is working with the 27 EU Member States and the European Space Agency (ESA) to design, develop and deploy a QKD-based European Quantum Communication Infrastructure (EuroQCI). It will be made up of fiber networks linking strategic sites at national and cross-border levels and a space segment based on satellites. "EuroQCI will reinforce the protection of Europe's governmental institutions, their data centers, hospitals, energy grids, and more," according to the EU. Telecom operators are involved in some of the national programs, including Orange, which is coordinating France's part of the program, called FranceQCI (Quantum Communication Infrastructure). Separately, this month, Toshiba and Orange announced they had successfully demonstrated the viability of deploying QKD on existing commercial networks.

Outside the EU, BT has already built and is now operating a commercial metro quantum-encryption network in London. "The London network has three quantum nodes, which are the bearers carrying the quantum traffic for all of the access ingress," explains Lord. For example, a customer in London's Canary Wharf could link via the network to the nearest quantum-enabled BT exchange. From there, it joins a metro network, which carries the keys from multiple customers "in an aggregated, cost-effective way to the egress points," according to Lord. "It is not trivial because you can mess things up and [get] the wrong keys," explains Lord. "You really have to be more careful about authentication and key management. And then it's all about how you engineer your quantum resources to handle bigger aggregation." It also gives BT the opportunity to explore how to integrate quantum systems downstream into its whole network. "What I'm telling the quantum world is that they need to get into the real world because a system that uses quantum is still going to be 90% non-quantum, and all of the usual networking rules and engineering practices apply. You still need to know how to handle fiber. You still need to know how to provision a piece of equipment and integrate it into a network."

SK Telecom is also heavily involved in quantum-related research, with developments including QKD systems for the control and interworking of quantum cryptography communication networks. Japan is another important center of QKD research: a QKD network has existed in Tokyo since 2010, and in 2020, financial services company Nomura Securities Co., Ltd. tested the transmission of data across the Tokyo QKD network.

As the EU's project makes clear, satellite is an important part of the mix. Lord expects satellite-based QKD networks to come on stream from 2025 or 2026, enabling the purchase of wholesale quantum keys from a dedicated satellite quantum provider. Back in 2017, China used a satellite to make the first very long-distance transmission of data secured by QKD, between Beijing and Vienna, a distance of 7,000 km.
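Metro QKD networks of the kind Lord describes commonly extend key reach through trusted relay nodes: each fiber hop establishes its own quantum key with its neighbor, and the end-to-end key is forwarded by one-time-padding (XOR-ing) it with each hop key in turn. The sketch below illustrates that relay logic only; random bytes stand in for keys a real QKD link would produce, the node names are invented, and this is not a description of BT's actual implementation.

```python
"""Toy trusted-relay key forwarding, a scheme commonly used to carry a
QKD-derived key across a multi-hop metro network. Random bytes stand in
for real QKD link keys; node names are illustrative."""
import os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# One pairwise QKD key per fibre hop: ingress -> relay -> egress.
hop1_key = os.urandom(32)   # CanaryWharf exchange <-> metro relay node
hop2_key = os.urandom(32)   # metro relay node     <-> egress exchange

# The end-to-end key the two customer sites will ultimately share.
end_to_end_key = os.urandom(32)

# Ingress pads the key with hop 1's key before it crosses the fibre.
on_fibre_1 = xor(end_to_end_key, hop1_key)

# The trusted relay removes hop 1's pad (key is briefly in the clear
# inside the node, hence "trusted") and re-pads with hop 2's key.
inside_relay = xor(on_fibre_1, hop1_key)
on_fibre_2 = xor(inside_relay, hop2_key)

# The egress node removes hop 2's pad and recovers the original key.
recovered = xor(on_fibre_2, hop2_key)

assert recovered == end_to_end_key
print("end-to-end key delivered:", recovered.hex()[:16], "...")
```

The aggregation and key-management headaches Lord mentions come from doing this for many customers at once while making sure each recovered key reaches only its rightful endpoint.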
Securing The Edge

There are additional efforts to secure communications with edge devices. BT's Lord, for example, sees a role for digital fingerprints for IoT devices, phones, cars and smart meters in the form of a physical unclonable function (PUF) silicon chip, which, because of random imperfections in its manufacture, cannot be copied. In the UK, BT is trialing a combination of QKD and PUF to secure the end-to-end journey of a driverless car. The connection to the roadside depends on standard radio with PUF authentication, while transmission from the roadside unit onward, as well as the overall control of the autonomous vehicle network, incorporates QKD, explains Lord.

SK Telecom has developed what it describes as a quantum-enhanced cryptographic chip with Korea Computer & Systems (KCS) and ID Quantique. Telefónica Spain has partnered on the development of a quantum-safe 5G SIM card and has integrated quantum technology into its cloud service hosted in its virtual data centers. Given China's heavy investment in quantum technologies, it is no surprise to see its telecom operators involved in the field. China Telecom, for example, recently invested three billion yuan ($434m) in quantum technology deployment, according to Reuters.

Quantum in The Cloud

Some of America's biggest technology companies are investing in quantum computing. Today, it is even possible to access quantum computing facilities via the cloud, albeit on a small scale. IBM's cloud access to quantum computers is free at the most basic level, rising to $1.60 per second for the next level. And this is just the beginning: America's big tech companies are racing to build quantum computers at scale.

One measure of scale is the size of a quantum processor, which is measured in qubits. While a traditional computer stores information as a 0 or 1, a qubit can represent both 0 and 1 simultaneously. This unique property enables a quantum computer to explore multiple potential solutions to a problem at once; and the greater the stability of its qubits, the more efficient it becomes.

IBM has a long history in quantum research and development. In 1998, it unveiled what was then a ground-breaking 2-qubit computer. By 2022, it had produced a 433-qubit processor, and in 2023, it aims to produce a 1,121-qubit processor. Separately, this month, it announced the construction of its first quantum data center in Europe, which it expects to begin offering commercial services next year. Google is also firmly in the race to build a large-scale quantum computer. In 2019, a paper in Nature featured Google's Sycamore processor and the speed with which it undertakes computational tasks. More recent work includes an experimental demonstration that errors can be reduced by increasing the number of qubits. Microsoft reckons that "a quantum machine capable of solving many of the hardest problems facing humanity will ultimately require at least 1 million stable qubits that can perform 1 quintillion operations while making at most a single error." To this end, it is working on what it calls a new type of qubit, a topological qubit. Amazon announced in 2021 an AWS Center for Quantum Computing on the Caltech campus to build a fault-tolerant quantum computer.
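The "both 0 and 1 at once" idea above can be made concrete with a few lines of NumPy: a qubit is a two-component complex vector, a Hadamard gate puts it into an equal superposition, and repeated measurement yields 0 and 1 roughly half the time each. This is a minimal state-vector simulation for intuition, not how a real quantum processor is programmed.

```python
"""Minimal single-qubit state-vector simulation: superposition and
measurement statistics (for intuition only)."""
import numpy as np

rng = np.random.default_rng(0)

ket0 = np.array([1.0, 0.0], dtype=complex)                     # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)    # Hadamard gate

state = H @ ket0                     # equal superposition (|0> + |1>) / sqrt(2)
probs = np.abs(state) ** 2           # Born rule: measurement probabilities

shots = rng.choice([0, 1], size=10_000, p=probs)   # simulate 10,000 readouts
print("amplitudes:", state)
print("P(0), P(1):", probs)
print("measured frequencies:", np.bincount(shots) / shots.size)
```

Simulating n qubits this way needs a vector of 2^n complex amplitudes, which is one intuitive way to see why classical simulation runs out of road and why hardware makers chase ever larger, more stable qubit counts.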

Read More
Software, Future Tech, Application Development Platform

Over the Waterfall to GitOps

Article | August 7, 2023

One of the first steps on the journey to cloud-native is transforming culture. This starts with embracing Agile methodology, followed by implementation of DevOps processes and eventually GitOps, as we explore in this extract from the recent e-book Mind the gap: bridging the skills divide on the journey to cloud native.

Most CSPs agree that culture, including governance and skills, is the single biggest obstacle to adopting a cloud-native architecture. Traditional waterfall project management focuses on a linear progression, where one task or process needs to be completed before the next can start. This approach is time-consuming and costly, and it stifles innovation. It's a major reason a CSP typically takes more than a year to develop a new service. Adopting Agile methodology is a completely new way of working that focuses on building cross-functional teams to speed innovation and service creation. This requires CSPs to seek individuals who have new project management skills and are adaptable and quick-thinking. Agile may not be suitable for every aspect of the business or for every project, but it is critical for moving to cloud-based, and eventually cloud-native, environments.

Agile's assumptions
• Early, continuous delivery of software leads to happy customers
• Changing requirements are always welcome, even in late development
• Working software is delivered frequently
• Business teams and developers work together every day
• Projects are built around motivated and trusted individuals
• Face-to-face is the best way to communicate
• Working software is the principal measure of progress
• Development is sustainable and constant
• Attention to technical excellence and good design are required
• Simplicity is essential
• The best architectures emerge from self-organizing teams
• Teams look for ways to be more effective and adjust accordingly

There are lots of Agile approaches, but many CSPs use a model made popular by Spotify, which organizes teams into 'squads,' 'tribes,' 'chapters,' and 'guilds.' Vodafone Group follows this model and uses 'very, very flat, non-hierarchical governance,' according to Dr. Lester Thomas, Chief IT Systems Architect at Vodafone Group. "We've learned doing this in the digital space, but we're trying to adopt that software approach right into our network."

Culture Eats Technology

UScellular began adopting Agile methodology about five years ago, and the company is implementing cloud-native applications wherever they make sense. During its shift to the new way of working, cultural change has been the most difficult obstacle to overcome, significantly harder than technological change, according to Kevin Lowell, the company's Chief People Officer and former Executive VP in charge of IT. The shift started with creating 'a compelling why': in this case, improving the experience customers have when using UScellular services. The company replaced some waterfall processes with iterative Agile processes managed in scrums and implemented in sprints. The IT team also began meeting regularly with business stakeholders and educating them about how Agile works.

Telecom Argentina is also embracing Agile. It is working with Red Hat to adopt a framework called Team Topologies to create a more efficient way of collaborating. The company is applying Team Topologies within its network division to create cross-functional teams that focus not only on the evolution and operation of technological platforms but also on creating and delivering services.
From Agile to DevOps

While Agile methodologies help to establish communication between IT teams and other stakeholders in the company, DevOps goes further by introducing an end-to-end software lifecycle that establishes a continuous flow of development, integration, testing, delivery and deployment. Google's approach to DevOps, called Site Reliability Engineering (SRE), has been widely adopted in telecoms. It provides the foundation for the ODA Canvas, and it is how Vodafone Group is implementing DevOps.

Vodafone is a cloud-native pioneer. For the past several years, the company has been transforming into a platform provider, using what it calls a 'telco-as-a-service' or TaaS strategy. Vodafone is becoming a software company on its quest to become a techco, which involves hiring 7,000 software engineers to add to the 9,000 already in the company. A key driver for embracing a cloud-native approach is "moving from our millions of human customers to billions of things," says Thomas. Instead of offering just four primary services – fixed voice, broadband Internet, mobility and TV – he envisions using 5G network slicing to support thousands of IoT services per vertical market. "Unless we can drive this through software-driven approaches and automation, we're not going to be successful," he says.

From DevOps to GitOps

The problem with DevOps, however, is that most CSPs aren't developing their own software; they buy solutions from vendor partners. As James Crawshaw, Principal Analyst for Telco IT & Operations at Omdia, notes in a research report, this makes it difficult for operators to create CI/CD pipelines that cut across the organizational boundaries between CSPs and suppliers. To address this, CSPs "have adapted DevOps to their needs and created GitOps, which they use to take third-party applications and deploy them on their own platforms," Crawshaw explains. Philippe Ensarguet, Group CTO at Orange Business Services, recently explained that GitOps requires continuous integration and continuous operations, or CI/CO. This means moving away from a prescriptive way of implementing operations to a declarative approach that supports full automation.

What is GitOps?

"If you rely mainly on the prescriptive approach, the day you want to move into production and scale up the number of applications you implement, you have to manage it purely with humans, and you hit the wall on scalability," says Ensarguet. William Caban, Telco Chief Architect at Red Hat, sees GitOps as foundational to the concept of zero-touch, zero-wait and zero-trouble services, which will be orchestrated end-to-end in autonomous networks. "This is exactly what GitOps is about: event-driven, intent-based networks," he says. "It becomes the operational model for architectures based on the ODA and autonomous networks."

CSPs must hire people with software and automation skills for GitOps. They also must reskill network experts, such as radio access network (RAN) engineers, to work in CI/CO teams so everyone uses common terminology. Some operators are going even further by creating centers of excellence (CoEs) where cross-functional teams from business, network and operations collaborate. "In GitOps, it is also necessary to codify team members' knowledge, so that even as people move around or leave the company, the software development and operations lifecycle processes are not disrupted," Caban says.
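The shift Ensarguet describes, from prescriptive scripts to a declarative model, comes down to a reconciliation loop: the desired state lives in Git, and an operator continuously converges the running system toward it. The Python sketch below is a stripped-down illustration of that loop; in practice tools such as Argo CD or Flux do this against Kubernetes, and the state format, app names and function names here are invented for the example.

```python
"""Stripped-down GitOps reconciliation loop: desired state is declared
(as it would be in a Git repo); the loop converges actual state to it.
The 'cluster' is just a dict standing in for a real platform API."""
import time

# Desired state as it might be declared in Git (replica counts per app).
desired_state = {"5g-upf": 3, "billing-api": 2, "portal": 4}

# Actual state of the running platform (normally queried from an API).
actual_state = {"5g-upf": 1, "billing-api": 2, "legacy-portal": 1}

def reconcile(desired: dict, actual: dict) -> dict:
    """One reconciliation pass: create, scale, or prune to match Git."""
    for app, replicas in desired.items():
        if actual.get(app) != replicas:
            print(f"scale {app}: {actual.get(app, 0)} -> {replicas}")
            actual[app] = replicas
    for app in list(actual):
        if app not in desired:               # drift not declared in Git
            print(f"prune {app}")
            del actual[app]
    return actual

if __name__ == "__main__":
    for _ in range(2):                       # a real operator loops forever
        actual_state = reconcile(desired_state, actual_state)
        time.sleep(1)
    print("converged:", actual_state)
```

The point of the declarative approach is visible in the loop: operators change the desired state (via a Git commit and review), and the machinery, not a human runbook, works out the steps to get there.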

Read More
Software, Low-Code App Development, Application Development Platform

Empowering Industry 4.0 with Artificial Intelligence

Article | August 23, 2023

The next step in industrial technology is about robotics, computers and equipment becoming connected to the Internet of Things (IoT) and enhanced by machine learning algorithms. Industry 4.0 has the potential to be a powerful driver of economic growth, predicted to add between $500 billion and $1.5 trillion in value to the global economy between 2018 and 2022, according to a report by Capgemini.
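To give a concrete flavor of "equipment connected to the IoT and enhanced by machine learning", the sketch below applies a simple rolling z-score check to a stream of simulated vibration readings, the kind of lightweight anomaly detection often run at the edge of a factory network. The sensor data, fault injection and threshold are all simulated assumptions for illustration.

```python
"""Toy Industry 4.0 example: flag anomalous vibration readings from a
simulated IoT sensor using a rolling mean / standard deviation check."""
from collections import deque
import random
import statistics

WINDOW, THRESHOLD = 50, 3.0        # samples of history, z-score cut-off
history = deque(maxlen=WINDOW)

def reading(t: int) -> float:
    """Simulated sensor: normal noise, with a fault injected at t == 400."""
    value = random.gauss(1.0, 0.05)
    return value + (1.5 if t == 400 else 0.0)

for t in range(500):
    x = reading(t)
    if len(history) == WINDOW:
        mu = statistics.fmean(history)
        sigma = statistics.pstdev(history) or 1e-9
        if abs(x - mu) / sigma > THRESHOLD:
            print(f"t={t}: anomaly, reading={x:.2f} (rolling mean {mu:.2f})")
    history.append(x)
```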

Read More

How Artificial Intelligence Is Transforming Businesses

Article | February 12, 2020

Whilst many people associate AI with sci-fi novels and films, its reputation as the antagonist of fictional dystopian worlds is becoming a thing of the past as the technology grows ever more integrated into our everyday lives. AI technologies are increasingly present in our daily lives, not just through Alexa devices in the home but throughout businesses everywhere, disrupting a variety of industries, often with tremendous results. The technology has helped to streamline even the most mundane of tasks while having a breathtaking impact on companies' efficiency and productivity.

Read More

Related News

AI Tech

IBM Expands Relationship with AWS to Bring Generative AI Solutions and Dedicated Expertise to Clients

PR Newswire | October 20, 2023

IBM (NYSE: IBM) today announced an expansion of its relationship with Amazon Web Services (AWS) to help more mutual clients operationalize and derive value from generative artificial intelligence (AI). As part of this, IBM Consulting aims to deepen and expand its generative AI expertise on AWS by training 10,000 consultants by the end of 2024; the two organizations also plan to deliver joint solutions and services upgraded with generative AI capabilities designed to help clients across critical use cases.

IBM Consulting and AWS already serve clients across a variety of industries with a range of AI solutions and services. Now, the companies are enhancing those solutions and services with the power of generative AI designed to help clients integrate AI quickly into business and IT operations building on AWS. IBM Consulting and AWS plan to start with these specific solutions:

Contact Center Modernization with Amazon Connect – IBM Consulting worked with AWS to create summarization and categorization functions for voice and digital interactions using generative AI, which are designed to allow for transfers between the chatbot and live agent and provide the agent with summarized details that expedite resolution times and improve quality management.

Platform Services on AWS – Initially introduced in November 2022, this offering is newly upgraded with generative AI to better manage the entire cloud value chain, including IT Ops, automation, and platform engineering. The new generative AI capabilities give clients tools to enhance business serviceability and availability for their applications hosted on AWS through intelligent issue resolution and observability techniques. Clients can expect improvements in uptime and mean time to repair, meaning they can respond quickly and effectively to potential issues as they arise.

Supply Chain Ensemble on AWS – This planned offering will introduce a virtual assistant that can help accelerate and augment the work of supply chain professionals as they aim to deliver on customer expectations, optimize inventories, reduce costs, streamline logistics, and assess supply chain risks.

Additionally, for clients looking to modernize on AWS, IBM Consulting plans to integrate AWS generative AI services into its proprietary IBM Consulting Cloud Accelerator to help accelerate the cloud transformation process. This will help with reverse engineering, code generation and code conversion.

Commitment to deepening expertise and expanding AWS on watsonx integration

IBM has already built extensive expertise with AWS's generative AI services, including Amazon SageMaker and Amazon CodeWhisperer, and is one of the first AWS Partners to use Amazon Bedrock, a fully managed service that makes industry-leading foundation models (FMs) available through an API, so clients can choose the model that's best suited for their use case. AI expertise and a deep understanding of AWS capabilities are critical for clients looking to implement generative AI, and IBM is already providing mutual clients with access to professionals from IBM Consulting's Center of Excellence for Generative AI with specialized generative AI expertise. With today's news, IBM Consulting plans to train and skill 10,000 consultants on AWS generative AI services by the end of 2024. They will have access to an exclusive, partner-only program that provides training on the top use cases and best practices for client engagement with AWS generative AI services.
This will help advance their knowledge, allow them to engage with technical professionals and better serve clients innovating on AWS.

"Enterprise clients are looking for expert help to build a strategy and develop generative AI use cases that can drive business value and transformation – while mitigating risks," said Manish Goyal, Senior Partner, Global AI & Analytics Leader at IBM Consulting. "Paired with IBM's AI heritage and deep expertise in business transformation on AWS, this suite of reengineered solutions with embedded generative AI capabilities can help our mutual clients to scale generative AI applications rapidly and responsibly on their platform of choice."

IBM is also responding to client demand for generative AI capabilities on AWS by making watsonx.data, a fit-for-purpose data store built on an open lakehouse architecture, available on AWS as a fully managed software-as-a-service (SaaS) solution, which clients can also access in AWS Marketplace. The company also plans to make watsonx.ai and watsonx.governance available on AWS by 2024. This builds on previous commitments made by the two companies to make it easier for clients to consume IBM data, AI and security software on AWS.

"Our customers are increasingly looking for the technical support and AI expertise they need to build and implement a generative AI strategy that drives business value from their entire cloud value chain," said Chris Niederman, Managing Director, Global Systems Integrators at AWS. "We are excited to be working with IBM to include embedded generative AI capabilities that assist our mutual customers scale their applications – and help IBM consultants deepen their expertise on best practices for customer engagement with AWS generative AI services."

Generative AI at scale for telecommunications

Clients are already benefitting from the longstanding relationship between IBM and AWS. Bouygues Telecom, a leading French communications service provider with a history of industry-leading innovation, engaged IBM Consulting to support the company's evolving cloud strategy to explore, design and implement AI use cases at scale while giving teams flexibility to select cloud and AI providers based on departmental and application needs. Leveraging the IBM Garage approach, the team co-designed a custom data and AI reference architecture covering multiple cloud scenarios that can extend to all AI and data projects across Bouygues Telecom's cloud and on-premises platforms.

"As we sought to leverage generative AI to extract insights from our engagements with clients, we were confronted with some unfamiliar issues around storage, memory size and power requirements," said Matthieu Dupuis, Head of AI for Bouygues Telecom. "IBM Consulting and AWS have been invaluable partners in identifying the right model for our needs and overcoming these technological barriers."

With the new AI platform on AWS, IBM Consulting enabled Bouygues Telecom to develop proof-of-concept models and scale them into production quickly while helping to minimize costs and risks. The platform also enables their data scientists to work with greater efficiency, purpose, and satisfaction by allowing them to spend more time on complex, high-value AI projects rather than launching standalone solutions.
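For readers curious what "foundation models available through an API" looks like in practice, here is a hedged boto3 sketch of invoking a model through Amazon Bedrock, which the announcement mentions above. The model ID and the request/response JSON fields vary by model family and are placeholders to check against the Bedrock documentation for the model you actually enable; this is a generic illustration, not an IBM Consulting or AWS asset.

```python
"""Hedged sketch: invoking a foundation model via Amazon Bedrock with
boto3. The model ID and JSON body fields are placeholders that differ
by model family; consult the Bedrock docs for your chosen model."""
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Placeholder model ID and request schema -- adjust for your chosen model.
MODEL_ID = "anthropic.claude-v2"
request_body = {
    "prompt": "\n\nHuman: Summarize this support call in two sentences: "
              "customer reports intermittent 5G drops after a SIM swap."
              "\n\nAssistant:",
    "max_tokens_to_sample": 200,
}

response = client.invoke_model(
    modelId=MODEL_ID,
    body=json.dumps(request_body),
    contentType="application/json",
    accept="application/json",
)

payload = json.loads(response["body"].read())
# The response field name also depends on the model family.
print(payload.get("completion", payload))
```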
An evolving relationship

With over 40 years of combined experience in AI solutions, IBM and AWS have been working together to respond to clients looking to leverage AI for cost, efficiency, and growth, whether they want a demonstration of the technology, help defining potential use cases, or full co-creation of bespoke solutions. Additionally, IBM is an AWS Premier Tier Services Partner with over 22,000 AWS certifications globally and has achieved 17 AWS Service Delivery and 16 AWS Competency designations. Today's news builds on this longstanding relationship and a shared belief in the importance of enterprise AI. Getting to enterprise AI at scale requires a human-centric, principled approach, and IBM Consulting helps clients establish guardrails that align with the organization's values and standards, mitigate bias, and manage data security, lineage, and provenance.

IBM Consulting accelerates business transformation for our clients through hybrid cloud and AI technologies, leveraging our open ecosystem of partners. With deep industry expertise spanning strategy, experience design, technology, and operations, we have become the trusted partner to many of the world's most innovative and valuable companies, helping modernize and secure their most complex systems. Our 160,000 consultants embrace an open way of working and apply our proven co-creation method, IBM Garage, to scale ideas into outcomes.

Statements regarding IBM's future direction and intent are subject to change or withdrawal without notice, and represent goals and objectives only.

Read More

AI Tech

AMD Enhances AI with Nod.ai Acquisition for Open-Source Solutions

AMD | October 11, 2023

• AMD acquires Nod.ai to boost open-source AI solutions.
• AMD's Senior Vice President expects smoother AI deployment with Nod.ai.
• Nod.ai's SHARK software speeds up AI model deployment, in line with AMD's innovation focus.

Advanced Micro Devices (AMD) has announced a definitive agreement to acquire Nod.ai, a leading open-source AI software expert. This strategic move is set to bolster AMD's open-source software strategy and expedite the deployment of optimized AI solutions on its high-performance platforms, including AMD Instinct data center accelerators, Ryzen AI processors, EPYC processors, Versal SoCs, and Radeon GPUs. The acquisition aligns with AMD's broader AI growth strategy, which aims to provide an open software ecosystem that simplifies AI model deployment for customers through developer tools, libraries, and models.

Vamsi Boppana, Senior Vice President, Artificial Intelligence Group at AMD, reportedly remarked, "The acquisition of Nod.ai is expected to significantly enhance our ability to provide AI customers with open software that allows them to easily deploy highly performant AI models tuned for AMD hardware. The addition of the talented Nod.ai team accelerates our ability to advance open-source compiler technology and enable portable, high-performance AI solutions across the AMD product portfolio. Nod.ai's technologies are already widely deployed in the cloud, at the edge and across a broad range of end-point devices today."

[Source – Globe Newswire]

Nod.ai, known for delivering optimized AI solutions to top hyperscalers, enterprises, and startups, brings its SHARK software, which automates compiler-based optimization. This software minimizes the need for manual fine-tuning, reducing the time required to deploy high-performance AI models across a wide range of data center, edge, and client platforms built on AMD CDNA, XDNA, RDNA, and 'Zen' architectures.

The acquisition reflects AMD's continuing commitment to innovation in high-performance computing, graphics, and visualization technologies. AMD seeks to provide adaptive products that cater to a broad range of industries and applications. It is important to note that this announcement includes forward-looking statements concerning the acquisition's expected benefits and is subject to certain risks and uncertainties. Investors are advised to review AMD's Securities and Exchange Commission filings for a detailed understanding of these risks and uncertainties.

Acquiring open-source AI technology may introduce dependence on community support and expertise, potentially leading to security concerns and limited official assistance. Integrating the new software can also bring compatibility issues, and AMD faces intense competition in the fiercely contested AI tech sector. However, the acquisition of Nod.ai enhances AMD's AI capabilities, streamlining the deployment of high-performance AI solutions. Embracing an open software strategy lowers entry barriers, and Nod.ai's automation reduces manual optimization needs, enabling deployment across diverse platforms while aligning with AMD's innovation focus.

Read More

AI Tech

Forethought’s Autoflows: AI-Driven Revolution in Customer Support

Forethought | September 25, 2023

Forethought, a leader in generative AI for customer support, has introduced Autoflows, a groundbreaking autonomous resolution capability for SupportGPT. This innovation marks a significant step towards a more efficient, goal-oriented, AI-centric future in customer support.

In the fast-evolving landscape of customer support, AI is positioned to revolutionize the way businesses interact with their clients. However, many large enterprises, despite claiming to be AI-first, still rely on outdated manual workflows rooted in the CRM era. These workflows consume hours of valuable time and often lead to subpar performance, dissatisfied customers, and delayed value realization. Forethought aims to transform this outdated approach into a system of intelligence, redefining what it means to be truly AI-first.

Deon Nicholas, CEO and Co-Founder of Forethought, reportedly emphasized, "Automation over the past decade has focused heavily on rules and tasks and building manual workflows. But the manual workflow is the Achilles heel of the AI-first future."

[Source – Businesswire]

Autoflows empower customer support leaders to define desired issue resolutions in plain, natural language, eliminating the need for complex decision trees or predefined rulesets. Leveraging AI, Autoflows predict user needs and determine efficient steps to achieve the desired outcomes. This efficiency allows support agents to redirect their focus towards more complex issues, transcending simplistic task-oriented processes.

Autoflows promise significant benefits:
• Enhanced Customer Experience: Autoflows facilitate natural conversations with customers, employing generative AI models trained on real agent responses and CRM data.
• Improved Performance: These flows autonomously resolve customer issues and take actions in alignment with agent guidelines, brand preferences, and user prompts.
• Reduced Time to Value: Autoflows can be created in minutes using natural language, a stark contrast to the days it typically takes to construct intricate decision trees.

Brent Pliskow, GM & VP of Customer Support at Upwork, expressed his appreciation for the innovation. He said Upwork is constantly seeking to incorporate new generative AI capabilities into its customer support processes, with the goal of improving the customer experience and enhancing internal efficiency. He noted that when Upwork replaced specific manual workflows with Autoflows, it observed significant time savings compared to the traditional approach of building workflows, and reported an increase of up to 27% in customer satisfaction in these particular use cases.

About Forethought

Founded in 2018, Forethought is a prominent generative AI company specializing in customer service automation. Its solutions seamlessly integrate generative AI powered by large language models (LLMs) to enhance support team efficiency, offering instant case resolution, predictive case prioritization, and agent assistance, all within a single platform. With $90 million in venture capital funding and accolades such as G2's Best Software Products for 2023, Forethought is a leader in the tech industry, headquartered in San Francisco, California.

Read More

Events

DevOps Vision

Conference

AI.dev

Conference
