This issue of Digitalisation World includes significant coverage of the first anniversary of GDPR, open source technology and collaboration. If there’s one theme that links these topics together, it must be that of trust. When it comes to handing over data, private individuals and corporations need to be able to trust that this information will not be misappropriated or used for purposes for which it was never intended. Despite many examples of data breaches and data misuse, most citizens still seem happy to hand over data concerning their lifestyles.
When it comes to collaboration and open source, both practices rely on the idea that everyone and every company, from the largest corporations right down to the humblest of individuals, has a role to play in working together and sharing information, hopefully for the good of all. As above, the key element is the trust required to work with others to develop solutions – solutions that are freely available to all, and not ‘hijacked’ along the way by proprietary technology.
And while we’re on the subject of trust, the ongoing furore concerning Huawei’s fitness, or otherwise, to supply 5G technology to the UK, is just one more example of an issue of trust. Is the Chinese government spying on us all, as Mr Trump suggests? Is he just indulging in yet more fake news to harm Huawei’s business prospects, and bolster those of US tech companies? And do we trust Facebook and Google any more, or less, than we do Huawei? And does any of it really matter anyway – after all, many citizens believe that, if a government wants to find out something about any individual, corporation, or other country, it will find a way to do so – enter Big Brother.
As intelligent automation develops, and data becomes the new oil (apologies for using this expression, but it does make the point!), levels of trust will have to increase and/or be stretched to breaking point if we are to benefit from all that AI and the like have to offer.
Or will there be a pushback at some point? Will the time come when the majority says ‘enough is enough’ and individual privacy regains its importance? I suspect not, if the millennials have anything to do with it!
Study tracks the progress of European businesses towards empowering their workforce with the latest technological innovations.
Dell Technologies and VMware have unveiled the findings from a survey, the results of which are published in the IDC Executive Summary, Becoming “Future of Work” Ready: Follow the Leaders, sponsored by Dell Technologies and VMware. The study focuses on the adoption of the latest technological innovations in European businesses. The results reveal that only 29% of European organisations have successfully established a Future of Work strategy - a holistic and integrated approach to empower a company and its workers with the latest innovations and concepts.
The study surveyed full-time employees from small, medium and large businesses across the Czech Republic, France, Germany, Italy, Poland, Spain and the United Kingdom. In the businesses identified as “Future of Work determined organisations” (FDOs – companies that have established a Future of Work strategy), the top initiative currently underway (50%) is the implementation of training programs to bring employees up to speed with the latest digital skill requirements. Almost half of FDOs consider employee productivity as a critical driver in the transformation of their workplace.
However, the workplace of the future is driven not only by improved digital skills but also by the working environment itself. 46% agreed that redesigning office space for smarter working is a programme currently under way, and France is leading the way, with well over half of organisations in the country driving renovations.
Working styles have continued to evolve, and European FDOs have acknowledged this and are adapting. 48% of European businesses have created security policies that support contemporary working styles such as flexible and remote working. The UK is leading the charge, with two-thirds of Britain-based companies demonstrating a strong resolve to adapt to the ever-changing needs of the European workforce.
“There are great examples of companies who have adopted a holistic approach to the Future of Work and their success highlights the importance of this approach to today’s workforce. More European companies need to consider this enterprise-wide strategy,” says Therese Cooney, Senior Vice President, Client Solutions Group, Dell Technologies. “The future workplace shouldn’t be created to solely fit the needs of the company, but also the people who drive it. We need to equip employees with the right digital skills, technology and security safeguards in an environment which helps them grow and succeed with improved collaboration, productivity and flexibility.”
“Prospective employees today are more selective than ever when it comes to deciding where they want to work, which means companies need to transform their workplaces to attract, retain and empower top talent,” says Kristine Dahl Steidel, Vice President, End User Computing, VMware EMEA. “Employees are at the heart of the digital transformation that is changing the future of work and companies that provide an employee experience that boosts flexibility, mobility and productivity are managing to increase their performance and overall success.”
Study highlights
While this isn’t a detrimental issue, companies should keep in mind that digital transformation shouldn’t be left by the wayside, but instead be built into their overall business plans for success.
The many benefits of cloud-based platforms can help simplify device management and improve scalability for an evolving business.
Eight in ten senior business and government leaders say digital competencies are either very or extremely important to achieving their organisational goals, according to a new Economist Intelligence Unit (EIU) survey commissioned by Riverbed Technology.
Digital competencies have become vital to achieving business goals, according to new research by the Economist Intelligence Unit (EIU). In Benchmarking competencies for digital performance, commissioned by Riverbed, eight in ten respondents see digital competencies as either very or extremely important in achieving revenue growth, service quality, mission delivery, profit growth/cost reduction, and customer satisfaction.
The study is based on a survey of more than 500 senior business and government leaders across the world, including the UK, focused on assessing nine behaviours, skills and abilities that help organisations improve their digital performance and, ultimately, achieve their objectives. Accompanying the study is a digital competency assessment tool, which enables users to benchmark their organisation’s competencies and performance against all survey respondents.
The survey uncovers a shared awareness among businesses that digital transformation is necessary to achieve their goals and remain competitive. Yet, more than half of organisations say they are struggling to achieve these important goals because they lack digital competencies. In particular, 65% of respondents say that their digital-competency gaps have negatively affected user experience, which explains why almost half of respondents say they need to improve digital experience management.
The central importance that companies place on improved digital competency comes despite the fact that some firms are yet to achieve meaningful results. About a third of organisations surveyed report only neutral or no measurable benefits from their digital strategies. The issues appear especially problematic in the public sector, with 60% of private-sector respondents describing their IT modernisation/transformation as advanced, compared with only 45% in the public sector.
In terms of overcoming this capability gap, the IT function plays a pivotal role. Organisations are aware that IT must be agile, as 78% of high performers globally cite IT infrastructure modernisation and transformation as their top digital competency for achieving their goals. In addition, enabling greater communication and collaboration between IT and the rest of the organisation (where digital competencies may be scarce) can significantly enhance digital performance and user experience.
Robert Powell, Editorial Director of EIU Thought Leadership (Americas), says: “The study shows a clear consensus among respondents that improving digital competency is vital for boosting organisational performance, even if some are not yet witnessing the results. Nevertheless, among the highest performing, the lessons are clear—do not hesitate, encourage internal collaboration, and, even if you feel ahead of your competition, never stop looking over your shoulder.”
Paul Higley, Vice President, Northern Europe, at Riverbed Technology comments, “The survey results support what we’re hearing from businesses and government leaders across the UK region. It’s time to start addressing the digital-skills gaps in order to fully deliver on digital transformation and build a workforce that will drive creativity, innovation and growth. The findings also highlight that forward-thinking organisations must prioritise investments in tools to measure, monitor, and improve the end-user experience if they are to stay ahead of their competition. Further to this, developing digital skills programs and modernising IT infrastructure are key areas of development to maximise digital performance.”
Digital transformation often fails because of an inability to onboard suppliers and poor user adoption.
Ivalua has published the findings of a worldwide survey of supply chain, procurement and finance business leaders, on the status of their digital transformations, the obstacles encountered and keys to success.
The research, conducted by Forrester Consulting and commissioned by Ivalua, used a digital maturity index to assess organizations’ structure, strategy, process, measurement and technology to determine the true level of digital maturity. It found that most organizations are significantly overestimating their maturity. Only 16% of businesses had an advanced level of digital maturity in procurement, giving them a source of competitive advantage over rivals, though 65% assessed themselves as advanced.
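To illustrate in principle how a maturity index of this kind can separate self-assessment from measured maturity, here is a minimal sketch in Python. The five dimensions are those named above; the weights, rating scale and level thresholds are purely hypothetical assumptions, since the article does not publish Forrester's actual scoring model.

# Illustrative sketch only - dimension weights, the 0-5 rating scale and the
# level thresholds are assumptions, not the study's published methodology.
DIMENSIONS = {"structure": 0.2, "strategy": 0.2, "process": 0.2,
              "measurement": 0.2, "technology": 0.2}

def maturity_score(ratings):
    """ratings: dict mapping each dimension to a self-assessed score from 0 to 5."""
    return sum(DIMENSIONS[d] * ratings[d] for d in DIMENSIONS)

def maturity_level(score):
    if score >= 4.0:
        return "advanced"
    if score >= 2.5:
        return "intermediate"
    return "beginner"

# A firm that feels "advanced" on strategy alone can still land lower once
# every dimension is weighed - the self-assessment gap the study describes.
self_view = {"structure": 4, "strategy": 5, "process": 3, "measurement": 2, "technology": 4}
score = maturity_score(self_view)
print(score, maturity_level(score))   # 3.6 intermediate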
“Procurement leaders have the opportunity to deliver a true competitive advantage for their organizations,” said David Khuat-Duy, Corporate CEO of Ivalua. “Digital transformation is critical to success, but requires a realistic assessment of current maturity, a clear vision for each stage of the journey and the right technology.”
Obstacles differ significantly based on the stage of transformation, indicating a need to assess technology against both current and future needs. Early in the journey, lack of budget and executive support were the primary obstacles, while more advanced organizations struggled with poor integration across their source-to-pay systems. As a result, advanced organizations were the most likely (60%) to be planning to implement a full ePurchasing suite.
The study revealed that organizations frequently make poor choices with regard to technology, which impedes digital transformation. More than four in five (82%) switched or are considering switching technology providers. The primary reasons for switching are poor supplier onboarding (30%) and poor user adoption (27%). Onboarding suppliers quickly is critical for any technology adoption, yet just 17% of organizations are able to onboard new suppliers in less than one month, with 59% taking 1-3 months per supplier.
“To ensure that technology empowers procurement transformation, rather than constrains it, leaders must consider their current and future requirements when evaluating options,” added David Khuat-Duy. “Doing so ensures a steady progression along their journey and the ability to gain an edge on competitors.”
Research finds inadequate access to skilled talent, technology, and data is holding AI initiatives back.
Most organisations are fully invested in AI but more than half don’t have the required in-house skilled talent to execute their strategy, according to new research from SnapLogic. The study found that 93% of US and UK organisations consider AI to be a business priority and have projects planned or already in production. However, more than half of them (51%) acknowledge that they don’t have the right mix of skilled AI talent in-house to bring their strategies to life. Indeed, a lack of skilled talent was cited as the number one barrier to progressing their AI initiatives, followed by, in order, lack of budget, lack of access to the right technology and tools, and lack of access to useful data.
The new research, conducted by Vanson Bourne on behalf of SnapLogic, studied the views and perspectives of IT decision makers (ITDMs) across several industries, asking key questions such as: where is your organisation in its AI/ML journey, what are the top barriers your organisation is facing when executing your AI initiatives, does your organisation have employees in-house with the required skillset to execute your strategy, and what are the top skills and attributes you are looking for in your AI team?
Where are organisations in their AI/ML journey?
When asked where organisations are in their AI/ML journey, most (93%) ITDMs claim to be fully invested in AI. Nearly three-quarters (74%) of organisations in the US and UK have initiated an AI project during the past three years, with the US leading the UK at 78% compared to 66% uptake.
Looking at specific industry sectors, the financial services industry is most progressive with 80% having current AI projects in place, followed closely by the retail, distribution and transport sector (76%) and the business and professional services sector (72%). Surprisingly, the IT industry was found to be among the least progressive in AI uptake with 70% having projects actively in place.
Key barriers holding AI initiatives back
Despite strong levels of AI uptake, organisations are being held back by significant barriers. More than half (51%) of ITDMs in the US and UK do not have the right in-house AI talent to execute their strategy. In the UK, this in-house skill shortage is considerably more acute, with 73% lacking the needed talent compared to 41% in the US.
In both the US and UK, manufacturing and IT are most challenged by this in-house talent shortage. In the UK, 69% of manufacturing organisations and 56% of those in the IT sector cite lack of in-house talent as the top barrier. Likewise, in the US, those two sectors face similar challenges, with 50% in manufacturing and 41% in the IT industry citing lack of in-house talent as the primary barrier.
Behind lack of access to skilled talent, ITDMs in the US and UK also consider a lack of budget (32%) to be a key issue holding them back, followed by a lack of access to the right technologies and tools (28%), as well as access to useful data (26%).
Building the right AI team
Interestingly, the top priority skills and attributes that organisations are looking for in their AI team are coding, programming and software development (35%), followed by an understanding of governance, security and ethics (34%) and data visualisation and analytics (33%). Just over a quarter of ITDMs (27%) are looking for talent with an advanced degree in a field closely related to AI/ML.
To build the right AI team, an impressive 68% said they are investing in retraining and upskilling existing employees. 58% of ITDMs indicated they are identifying and recruiting skilled talent from other companies and organisations, while almost half (49%) believe that recruiting from universities is important to getting an effective AI team in place.
Gaurav Dhillon, CEO at SnapLogic, commented: “The AI uptake figures are very encouraging, but key barriers to execution remain in both the US and UK. For organisations to accelerate their AI initiatives, they must upskill and recruit the right talent and invest in new technology and tools. Today’s self-service and low-code technologies can help bridge the gap, effectively democratising AI and machine learning by getting these transformative capabilities into the hands of more workers at every skill level and thus moving the modern enterprise into the age of automation.”
LogMeIn has published the results of a new study conducted by Forrester Consulting to determine how customer experience strategies affect overall business success.
The study, Build Competitive Advantage Through Customer Engagement and AI, surveyed 479 global customer engagement decision makers and found that organisations with a more mature strategy – including those who make Customer Experience (CX) an organisational priority and leverage omni-channel and Artificial Intelligence (AI) technologies – see an increase in revenue and conversion at double the rate of other companies. The results also show that as the maturity gap continues to widen, organisations that are falling behind may never be able to catch up to their more mature competitors.
“Exceptional customer experience is a cornerstone of business success. Better customer engagement leads not only to higher customer satisfaction, but also to greater top-line revenue growth and more satisfied customer-facing employees,” according to the study. “Organisations with greater engagement maturity reap benefits, not only more often, but of greater value, than less mature companies.”
The Impact of Technology
Emerging technologies like artificial intelligence are also accelerating the divide. Companies with a more mature engagement approach can more quickly adapt and incorporate powerful use cases of AI that propel them forward. 36% of the least mature respondents use AI, but only in proofs of concept. Meanwhile, 58% of CX “experts” have implemented a holistic AI strategy and roadmap. The contrast between the long-term and short-term strategies of these organisations represents a significant setback for the less mature companies, which have not been able to capitalise on the business intelligence that AI-powered technology can provide.
Challenges Facing Less Mature Organisations
The study showed that 37% of less mature organisations rely too much on obsolete technology – especially in the area of digital channel support. Furthermore, poor self-service and automation capabilities are leading to frustrated customers and increased call centre volume – which, by contrast, is not a typical challenge for the most mature of the group, who are successfully leveraging these capabilities to create better overall customer experiences. Additional challenges that less mature organisations run into include lack of visibility into both customer data (37%) and the performance of engagement channels (42%). These limited views inhibit a company’s ability to quickly address weaknesses and understand how best to serve their customers.
Measurable Impact
For the most mature of those surveyed, 63% saw an increase in NPS as a result of their current customer engagement strategies, with scores an average of 8 points higher than those of their less mature counterparts. Further, half of these organisations saw an increase in conversion rates, 56% reported an increase in revenue and 40% saw an increase in order size. Even agent satisfaction increased at the more mature organisations, with nearly 50% reporting an increase in overall job happiness.
“With all of the hype around AI’s place in customer experience, it can be hard for companies to separate fact from fiction,” said Ryan Lester, Senior Director of Customer Engagement Technologies at LogMeIn. “The results of this study helped provide some clarity around the importance of continuing to evolve customer engagement strategies. Technologies like AI are creating a significant competitive advantage for leaders and leaving the rest falling irreparably behind.”
Splunk has released research that shows organisations are ignoring potentially valuable data and don’t have the resources they need to take advantage of it. The research reveals that although business executives recognise the value of using all of their data, more than half (55 percent) of an organisation’s total data is “dark data,” meaning they either don’t know it exists or don’t know how to find, prepare, analyse or use it.
The State of Dark Data Report, built using research conducted by TRUE Global Intelligence and directed by Splunk, surveyed more than 1,300 global business managers and IT leaders about how their organizations collect, manage and use data. In an era where data is connecting devices, systems and people at unprecedented growth rates, the results show that while data is top of mind, action is often far behind.
·76 percent of respondents surveyed across the U.S., U.K., France, Germany, China, Japan, and Australia agree “the organization that has the most data is going to win.”
·60 percent of respondents said that more than half of their organizations’ data is dark, and one-third of respondents say more than 75 percent of their organization’s data is dark.
·Business leaders say their top three obstacles to recovering dark data are the volume of data, followed by a lack of the necessary skill sets and a lack of resources.
·More than half (56 percent) admit that “data-driven” is just a slogan in their organization.
·82 percent say humans are and will always be at the heart of AI.
“Data is hard to work with because it’s growing at an alarming rate and is hard to structure and organise. So, it’s easy for organisations to feel helpless in this chaotic landscape,” says Tim Tully, chief technology officer, Splunk. “I was pleased to see the opportunity people around the world attach to dark data, even though fewer than a third of those surveyed say they have the skills to turn data into action. This presents a tremendous opportunity for motivated leaders, professionals and employers to learn new skills and reach a new level of results. Splunk can help those organizations feel empowered to take control of identifying and using dark data.”
Respondents are Slow to Seize Career and Leadership Opportunities
While respondents understand the value of dark data, they admit they don’t have the tools, expertise or staff to take advantage of it. Plus, the majority of senior leaders say they are close enough to retirement that they aren’t motivated to become data-literate. Data is the future of work, but only a small percentage of professionals seem to be taking it seriously. Respondents agree there is no single answer, though the top solutions having potential included training more employees in data science and analytics, increasing funding for data wrangling, and deploying software to enable less technical employees to analyze the data for themselves.
·92 percent say they are “willing” to learn new data skills but only 57 percent are “extremely” or “very” enthusiastic to work more with data.
·69 percent said they were content to keep doing what they’re doing, regardless of the impact on the business or their career.
·More than half of respondents (53 percent) said they are too old to learn new data skills when asked what they were doing to educate themselves and their teams.
·66 percent cite lack of support from senior leaders as a challenge in gathering data and roughly one-in-five respondents (21 percent) cite lack of interest from organization leaders as a challenge.
AI is Believed to Be The Next Frontier for Data-Savvy Organizations
Globally, respondents believe AI will generally augment opportunities rather than replace people. While the survey revealed that few organizations are using AI right now, a majority see its vast potential. For example, across a series of use cases including operational efficiency, strategic decision making, HR and customer experience, only 10 to 15 percent say their organisations are deploying AI, while roughly two-thirds see the potential value.
·A majority of respondents (71 percent) saw potential in employing AI to analyze data.
·73 percent think AI can make up for the skills gaps in IT.
·82 percent say humans are and will always be at the heart of AI and 72 percent say that AI is just a tool to solve business problems.
·Only 12 percent are using AI to guide business strategy and 61 percent expect their organisation to increase its use of AI this way over the next five years.
Regional Differences
There are some key differences in the UK-specific results. For example, 39 percent of people in the United Kingdom believe AI can make up for the skills gap, versus only 27 percent globally. UK employees are also the most likely in the world to say they need to learn more data skills in order to be promoted, at 83 percent compared to the global figure of 76 percent. Additional UK-specific results include:
·The UK often comes second only to China in its enthusiasm for data and AI, and its belief in the importance of data skills
·67 percent of UK companies agree “data-driven” is just a slogan at their organization, compared with only 56 percent globally
·The majority of respondents in the UK market (61 percent) report understanding AI extremely or very well — one of only two markets in which a majority make that claim (the other is China, at 77 percent). The global average is 48 percent
The European Managed Services Summit in May revealed a sector full of optimism, facing up to its challenges as a mature industry. A wealth of new research emerged at the same time, reinforcing the ideas from the events in London and Amsterdam last year that providers needed to specialise, and also work harder with their customers.
First out of the gate was Datto, whose annual managed services report is an eclectic mix of revelations as to the work-life pressures on a sector which faces continuous engagement with its customers, plus the news that half the MSPs have now been in business more than sixteen years. So this is now a mature industry with a mature industry’s outlook; but such problems as it faces are those of progress and growth. In the study, nearly 100% of the MSPs surveyed state that now is as good a time as ever to be in their industry.
At the Amsterdam Summit, Jason Howells, director of the international MSP business at Barracuda MSP, presented a session based on his MSPDay research and also offered some insights into the state of the market. One of the things that came out quite strongly was how fast things are changing: “Yes, I think this is a mature industry, but at the same time we've got a lot of new entrants - a lot of them coming out of customers and coming out of resellers,” he says.
“I think it's extremely fast-paced at the moment - not that the industry has never not been fast-paced. The MSP space in particular now is getting so much attention and investment it means that things are probably moving faster.”
“What we are seeing is demand from the end-user. That means that whether you are a reseller or a managed service provider, if you're not evolving your own businesses to adapt, then you're probably not going to survive. It will continue into a situation where most, if not all of the IT channel, will be providing monthly services.”
Managed Services Providers are facing an increasingly complex and competitive customer set, advised Mark Paine from researcher Gartner in the first keynote in Amsterdam. The complexity of the customer buying process, with many individuals involved at various levels, has stretched the sales cycle to double the length customers expect, though recent evidence shows it shortening again as MSPs become more sophisticated, he told the event.
The trust that both MSPs and customers look to achieve is possible, but IT service suppliers need to work on their authenticity. “Find out who your prospects trust and make sure they know about you,” he says, “while you work to reposition your company and create authentic stories.”
The issue of trust also came up in the Barracuda MSPDay report referred to earlier, which asked what MSPs thought about customer relationships. This showed that customer misconception was identified as a barrier by the largest group of channel partners (89% of the sample). This seems to result from a view among customers that managed services provision removes their responsibilities for their networks, security and disaster recovery. “MSPs need to spell out the limits of their responsibilities, and the industry as a whole should work to educate its customer base on what is possible,” says Jason Howells.
He reported that network management had risen to the top of the list of services offered by MSPs, with email second, again based on security and reliability issues. And in a change from recent reports of high levels of competition in the market, the MSPs were positive: 73 percent concede there are “still plenty of opportunities out there.”
All of which is good news for MSPs, though in such a fast-changing industry they need to stay on top of trends and new offerings. The next snapshot of the industry will be revealed at the London Managed Services & Hosting Summit 2019 on 18th September (https://mshsummit.com/), and at the similar Manchester event on 30 October (https://mshsummit.com/north/).
The Managed Services & Hosting Summits are firmly established as the leading managed services events for the channel. Now in its ninth year, the London Managed Services & Hosting Summit 2019 aims to provide insights into how managed services continue to grow and change as customer demands push suppliers into a strategic advisory role, and as the pressures for compliance and resilience impact the business model at a time of limited resources. Managed Service Providers, other channels and their suppliers can evolve new business models and relationships, but are looking for advice and support as well as strategic business guidance.
The Managed Services & Hosting Summits feature conference session presentations by major industry speakers and a range of sessions exploring both technical and sales/business issues.
More than half of organisations enforce encryption of data on all mobile devices and removable media.
Apricorn has published findings from a survey highlighting the rise in encryption technology since GDPR enforcement. Two thirds (66%) of respondents now hardware encrypt all information as standard, a positive step considering over a quarter (27%) noted the lack of encryption as one of the main causes of a data breach within their organisation.
This contrasts with last year’s survey, in which only around half enforced encryption of data, or were completely confident in their encrypted data, in transit (52%), in the cloud (52%) and at rest (51%) – showing a discernible increase in the use of, and need for, encryption as a key component of the data security process.
Forty-one percent of respondents have also noticed an increase in the implementation of encryption in their organisation since GDPR was enforced, and their organisation now requires all data to be encrypted as standard, whether at rest or in transit. This demonstrates the significance of encryption in GDPR compliance and the protection of sensitive data. It is likely driven by encryption being specifically recommended in Article 32 of GDPR as a method to protect personal data, and in Article 34, where obligations towards breached data subjects are reduced if the breached data is encrypted.
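As a simple illustration of what encrypting personal data at rest can look like in practice, here is a minimal, hedged sketch using Python's cryptography package (an assumption chosen for illustration; the survey does not name any specific tooling, and Apricorn's own products are hardware-based).

# Minimal sketch of software encryption of a record at rest, assuming the
# Python "cryptography" package. Illustrative only; not tied to any product
# or respondent mentioned in the article.
from cryptography.fernet import Fernet

# In practice the key would be held in a key management service or hardware
# module, never stored alongside the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"name=Jane Doe;email=jane@example.com"   # hypothetical personal data

token = cipher.encrypt(record)      # ciphertext written to disk or cloud storage
restored = cipher.decrypt(token)    # recoverable only with access to the key

assert restored == record

Under Article 34, it is this property - that breached ciphertext is unintelligible without the key - which reduces an organisation's obligations towards affected data subjects.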
GDPR is clearly making security a board-level topic, with the C-suite now owning the security budget in eighty-six percent of the companies surveyed. Organisations are allocating just under a third (30%) of their IT budget to GDPR compliance, which is a huge increase when considered against research commissioned by IBM in 2018 that set the ideal spend on cyber security, in general, at 9.8 to 13.7% of the IT budget.
However, despite last year’s survey finding that ninety-eight percent of those who knew GDPR applied to them forecast a need to assign further budget and resources after achieving compliance, almost a quarter (24%) of this year’s respondents who claim to be compliant believe they do not need to assign any further budget or resources.
Jon Fielding, Managing Director, EMEA Apricorn commented: “With the one year anniversary of GDPR, it’s clear that organisations are getting their houses in order, but there still seems to be a long way to go in terms of education and awareness. Organisations need to be mindful that GDPR is an ongoing process and not just a tick box exercise. The most common ways to maintain compliance are to continue to enforce and update all policies and invest in employee awareness on a regular basis. Additionally, encryption is a key component within the compliance “kit”, helping to lessen the probability of a breach and mitigate any financial penalties and obligations that would apply in the unfortunate event of a breach.”
61% of IT organizations have only little to modest confidence in their ability to mitigate access security threats, despite a majority significantly increasing their near-term budgets.
Pulse Secure has published its “2019 State of Enterprise Secure Access” report, available for download at https://www.pulsesecure.net/SESA19/. The findings quantify threats, gaps and investment as organizations face increasing hybrid IT access challenges. The survey of large enterprises in the US, UK and DACH uncovers business risk and impact, resulting in a pivot towards extending Zero Trust capabilities to enable productivity and stem exposures to multi-cloud resources, applications and sensitive data.
While enterprises are taking advantage of cloud computing, the survey data showed all enterprises have ongoing data center dependencies. One fifth of respondents anticipate lowering their data center investment, while more than 40% indicated a material increase in private and public cloud investment. According to the report, “the shift in how organizations deliver Hybrid IT services to enable digital transformation must also take into consideration empowering a mobile workforce, supporting consumer and IoT devices in the workplace and meeting data privacy compliance obligations – all make for a challenging environment to ensure, monitor and audit access security.”
"What was consistent across enterprise sizes, sectors, or location was that secure access for hybrid IT is a current and growing concern with cyberthreats, requirements and issues emerging from many sources. The reporting findings and insights should empower corporate leadership and IT security professionals to re-think how their organizations are protecting resources and sensitive data as they migrate to the cloud," said Martin Veitch, editorial director at IDG Connect.
Key Findings
The survey found the most impactful incidents were attributed to a lack of user and device access visibility and to lax endpoint, authentication and authorization access controls. Over the last 18 months, half of all companies dealt with malware, unauthorized/vulnerable endpoint use and mobile or web app exposures. Nearly half experienced unauthorized access to data and resources due to insecure endpoints and privileged users, as well as unauthorized application access due to poor authentication or encryption controls.
While a third expressed significant confidence, 61% of respondents indicated only modest confidence in their security processes, human resources, intelligence and tools to mitigate access security threats. The survey also revealed the top deficiencies in access threat mitigation. When survey participants were asked what they perceive as their largest operational gaps for access security, the majority identified hybrid IT application availability; user, device and mobile discovery and exposures; weak device configuration compliance; and inconsistent or incomplete enforcement. Correspondingly, the participants stated that their organizations are stepping up their access security initiatives.
The cited incidents, threat mitigation deficiencies and operational gaps are among the reasons for the interest in a Zero Trust approach to access security. A Zero Trust model authenticates, authorizes and verifies users, devices, applications and resources no matter where they reside. It encompasses proving identity, device and security state before and during a transaction; applying least-privilege access closest to the entities, applications and data; and extending intelligence to allow policies to adapt to changing requirements and conditions.
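To make the model more concrete, the short sketch below shows a Zero Trust-style authorization check in Python. The request attributes, trust checks and policy table are hypothetical illustrations of the principles described above (identity and device state proven per transaction, least-privilege policy, default deny); they are not drawn from Pulse Secure's report or products.

from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    role: str                # e.g. "finance-analyst"
    mfa_passed: bool         # identity proven with a second factor
    device_compliant: bool   # endpoint posture: patched, encrypted, managed
    resource: str            # application or data set being requested
    action: str              # "read", "write", ...

# Hypothetical least-privilege policy: role -> allowed (resource, action) pairs.
POLICY = {
    "finance-analyst": {("erp-reports", "read")},
    "hr-admin": {("hr-records", "read"), ("hr-records", "write")},
}

def authorize(req: AccessRequest) -> bool:
    """Deny by default; allow only when identity, device state and the
    least-privilege policy all check out, regardless of network location."""
    if not req.mfa_passed or not req.device_compliant:
        return False
    return (req.resource, req.action) in POLICY.get(req.role, set())

# A compliant, authenticated analyst reading reports is allowed; the same
# user writing to HR records is denied.
print(authorize(AccessRequest("jo", "finance-analyst", True, True, "erp-reports", "read")))   # True
print(authorize(AccessRequest("jo", "finance-analyst", True, True, "hr-records", "write")))   # False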
Adding to management complexity, the report also found that organizations employ three or more secure access tools for each of the 13 solutions presented in the survey. Larger companies have about 30% more tools than smaller enterprises. Correspondingly, nearly half of respondents were open to exploring the benefits of consolidating their security tools into suites. With the migration to cloud, one tool of interest cited by respondents as being implemented or planned over the next 18 months is the Software Defined Perimeter (SDP).
Research Highlights
The independent research for the report, which offers key insights into the current access security landscape and the maturity of defenses, was conducted by IDG Connect. Survey respondents included more than 300 information security decision makers in enterprises with more than 1,000 employees across U.S., U.K. and DACH regions, and covered key verticals including financial services, healthcare, manufacturing and services.
“We are pleased to sponsor the 2019 State of Enterprise Secure Access Report. The independent research provides a useful litmus test for the level of exposure, controls and investment regarding hybrid IT access,” said Scott Gordon, chief marketing officer at Pulse Secure. “The key takeaway from this report is hybrid IT delivery has expanded security risks and necessitates more stringent access requirements. As such, organizations should re-assess their secure access priorities, capabilities and technology as part of their Zero Trust strategy.”
The hyperscale data center market is set to grow from a valuation of over USD 20 billion in 2018 to USD 65 billion by 2025, according to a new research report by Global Market Insights, Inc.
Rising demand for big data and cloud computing solutions in distributed computing environments is expected to drive the hyperscale data center market. The rapid increase in data traffic is a major challenge for organizations needing to store, manage and retrieve massively growing volumes of data. Companies are adopting cloud computing solutions for benefits such as flexibility, reduced IT expenses, scalability, collaboration efficiency, automatic updating and storage of large volumes of data.
The rapid increase in business data is encouraging large enterprises to invest heavily in IT expansion. For instance, in August 2018, Google announced its plan to expand its data center in Singapore to scale up capacity and meet increasing demand for services. In addition, the company will invest USD 600 million to expand its South Carolina facility. The growing popularity of cloud-based infrastructure and investments to expand product portfolios are contributing to hyperscale data center market growth.
Demand for rack-based Power Distribution Units (PDUs) is growing rapidly in the hyperscale data center market due to their high availability and high power ratings. These PDUs can be incorporated with all types of rackmount equipment without interrupting the power supply. They help organizations reduce power consumption, thereby enhancing the efficiency of an IT facility, and also help to reduce an organization's carbon footprint. As a result, they are being widely adopted by businesses looking to enhance efficiency.
Increasing adoption of cloud-based services and rapid growth in smartphone and social media users in Asia Pacific are expected to drive market size in the region. The number of smartphone users is expected to exceed 6 billion by 2025, with countries such as India, China, South Korea, Taiwan and Indonesia being the major contributors. Businesses in the region are adopting data-intensive applications such as IoT, data analytics and AI services, which require large amounts of data and capital investment. Several companies are constructing hyperscale facilities to reduce their capital and operational expenses.
In the hyperscale data center market, the IT & telecom sector accounted for over 45% of the industry share and is witnessing high adoption of large-scale infrastructure facilities owing to increasing data generation and storage requirements. Telecom operators are offering flexible internet and data plans, which is driving data traffic. The rapid increase in data generation is encouraging businesses to implement highly scalable and efficient IT environments with high computing power, thereby accelerating market growth. Global telecommunication companies are establishing mega infrastructure to cater to a widespread customer base.
The Western European and Nordic markets are experiencing high demand for hyperscale data centers owing to the ready availability of renewable energy sources and land for development, tax incentives, strengthening fiber connectivity and falling electricity costs. These factors are encouraging cloud service providers to construct more hyperscale data centers in Western Europe. For instance, in June 2018, Google announced its plan to launch a hyperscale data center in the Netherlands due to the availability of sustainable energy sources, which will help to lower energy costs for data centers.
Key players operating in the hyperscale data center market include Broadcom Ltd., Cavium, Inc., Cisco Systems, Inc., Dell, Inc., Huawei Technologies Co., Ltd., IBM Corporation, Intel Corporation, Lenovo Group Ltd., Microsoft Corporation, NVIDIA Corporation, Sandisk LLC, and Schneider Electric SE, among others. Players in the market are developing business-specific solutions that enable customers to customize them based on enterprise requirements.
Over 84% of organisations in Europe use or plan to use digitally transformative technologies, but only a little more than half (55%) claim these deployments are very or extremely secure.
Thales has revealed a growing security gap among European businesses – with almost a third (29%) of surveyed enterprises experiencing a breach last year, and only a little more than half (55%) believing their digital transformation deployments are very or extremely secure. These findings are detailed in the 2019 Thales Data Threat Report – Europe Edition, with research and analysis from IDC.
Across Europe, more than 84% of organisations are using or planning to use digitally transformative technologies including cloud, big data, mobile payments, social media, containers, blockchain and Internet of Things (IoT). Sensitive data is highly exposed in these environments: in the UK, almost all (97%) of these organisations state they are using this type of data with digital transformation technologies.
“Across Europe, organizations are embracing digital transformative technologies – while advancing their business objectives, this is also leaving sensitive data exposed,” said Sebastien Cano, senior vice president of cloud protection and licensing activity at Thales. “European enterprises surveyed still do not rank data breach prevention as a top IT security spending priority – focusing more broadly on security best practice and brand reputation issues. Yet, data breaches continue to become more prevalent. These organisations need to take a hard look at their encryption and access management strategies in order to secure their digital transformation journey, especially as they transition to the cloud and strive to meet regulatory and compliance mandates.”
Security confidence challenged in digitally transformative environments
However, not everyone is confident of the security of these environments. Across Europe, only a little more than half (55%) claim their digital deployments are very or extremely secure. The UK is the most confident in its levels of security with two thirds (66%) saying they are very or extremely secure. In Germany, confidence is much lower at 49%.
Multi-cloud security remains top challenge
The most common use of sensitive data within digital transformation is in the cloud. Across Europe, 90% of organisations are using, or will use, all cloud environments this year (Software as a Service, Platform as a Service and Infrastructure as a Service). These deployments do not come without concerns, however. The top three security issues for organisations using cloud were ranked as:
-38% - security of data if cloud provider is acquired/fails;
-37% - lack of visibility into security practices; and,
-36% - vulnerabilities from shared infrastructure and security breaches/attacks at the cloud provider.
Businesses are working hard to alleviate these concerns. Over a third (37%) of organisations rank encryption of data with service provider-managed encryption keys, detailed architecture and security information for IT and physical security, and SLAs in the event of a data breach as jointly the most important changes needed to address security issues in the cloud.
Compliance is not a security priority
Despite more than 100 new data privacy regulations, including GDPR, affecting almost all (91%) organisations across Europe, compliance is only seen as a top priority for security spend in the UK by 40% of businesses. Interestingly, 20% of UK businesses failed a compliance audit in the last year because of data security issues. When it comes to meeting data privacy regulations, the top two methods named by respondents working to meet strict regulations are encrypting personal data (47%) and tokenising personal data (23%).
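For readers weighing those two methods, the sketch below illustrates the difference tokenisation makes: the personal value is swapped for a random token with no mathematical relationship to the original, and the real value lives only in a separate, access-controlled vault. It is a simplified, hypothetical illustration in Python, not a description of any respondent's system.

import secrets

class TokenVault:
    """Toy token vault: swaps personal data for random tokens and keeps the
    mapping in a separate store (a plain dict here, purely for illustration)."""
    def __init__(self):
        self._store = {}

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(8)   # random; reveals nothing about the value
        self._store[token] = value
        return token

    def detokenize(self, token: str) -> str:
        return self._store[token]               # only possible inside the trusted boundary

vault = TokenVault()
token = vault.tokenize("jane.doe@example.com")  # hypothetical personal data
print(token)                    # e.g. tok_9f2c4e1ab07d5633 - safe to store downstream
print(vault.detokenize(token))  # the original value, recoverable only via the vault

Unlike encryption, there is no key that can reverse the token; systems that only ever handle tokens hold nothing that identifies the data subject.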
“Clearly there is a significant shift to digital transformation technologies and the issues around data held within these cannot be taken lightly,” said Frank Dickson, program vice president for security products research, IDC. “Data privacy regulations have been hot on the agenda over the past 18 months, with so many coming into force. Organisations are now finding themselves considering the cost of becoming compliant against the risk of potential breaches and the subsequent fines.”
Attack levels are high
One of the most jarring findings of the report is that almost two thirds of organisations across Europe (61%) have encountered a data breach at some stage. The UK fares slightly better than the European average, with just over half (54%) of organisations saying they have encountered a breach. However, across Europe, 29% of organisations who have faced a data breach did so in the last year; a shocking one in ten have suffered a data breach both in the last year and at another time.
Companies obstructed by a lack of technology, processes, training and support from leadership to get the most from their technology spend.
Global organizations are demanding more from their data management investments, despite most estimating that they achieve more than double the amount they invest, finds research from Veritas Technologies, a worldwide leader in enterprise data protection and software-defined storage.
The Value of Data study, conducted by Vanson Bourne for Veritas, surveyed 1,500 IT decision makers and data managers across 15 countries, and reveals that although companies see an average return of $2.18 USD for every $1 USD they invest in improving data management, an overwhelming majority (82 percent) of businesses expect to see an even higher return.
Just 15 percent achieved the ROI they expected to receive, while only 1 percent said the ROI they achieved exceeded expectations.
Businesses admit the key factors preventing them from improving their ROI are inadequate employee engagement or training (57 percent), a lack of the right technology to support data management (40 percent) and a lack of internal processes (36 percent). A third (33 percent) also cited an absence of support from senior management as a barrier to achieving a higher return on data management investment.
“Mismanaging data can cost businesses millions in security vulnerabilities, lost revenues and missed opportunities, but those that invest wisely are seeing the incredible potential of their data estates. Unfortunately, too many are being held back by technological or people-related challenges,” said Jyothi Swaroop, vice president, Product & Solutions, Veritas.
“Organizations must arm themselves with the ability to access, protect and derive insights from their data. By promoting a cultural shift in the way data is managed, which includes buy-in from leadership as well as tools, processes and training, companies can empower employees with full visibility and control of data.”
Take care of your data, and it will take care of you
Organizations that are investing in the proper management of their data say they are already benefiting from their investment and are achieving the objectives they set out to achieve. Respondents ranked increased data compliance, reduced security risks, cost savings, and the ability to drive new revenue streams and market opportunities, as the most attractive benefits of improving data management.
Of the organizations that are investing in looking after their data, four in five (81 percent) say they are already experiencing increased data compliance and reduced data security risks, while 70 percent are seeing reduced costs. Nearly three-quarters (72 percent) are driving new revenue streams or market opportunities as a result of investing in data management.
“As cases of high-profile data breaches and threats of hefty fines for regulatory non-compliance continue to plague the headlines, one of the biggest drivers for investing in data management is to protect their data. But many are also benefitting greatly from the ability to use their data more intelligently. Those that invest in overcoming the barriers to effective data management will reap significant rewards in today’s digital economy,” added Swaroop.
Vertiv, together with technology analyst firm 451 Research, has released its report on the state of 5G, “Telco Study Reveals Industry Hopes and Fears: From Energy Costs to Edge Computing Transformation”. The report captures the results of an in-depth survey of more than 100 global telecom decision makers with visibility into 5G and edge strategies and plans. The research covers 5G deployment plans, services supported by early deployments, and the most important technical enablers for 5G success.
Survey participants were overwhelmingly optimistic about the 5G business outlook and are moving forward aggressively with deployment plans. Twelve percent of operators expect to roll out 5G services in 2019, and an additional 86 percent expect to be delivering 5G services by 2021.
According to the survey, those initial services will be focused on supporting existing data services (96 percent) and new consumer services (36 percent). About one-third of respondents (32 percent) expect to support existing enterprise services with 18 percent saying they expect to deliver new enterprise services.
As networks continue to evolve and coverage expands, 5G itself will become a key enabler of emerging edge use cases that require high-bandwidth, low latency data transmission, such as virtual and augmented reality, digital healthcare, and smart homes, buildings, factories and cities.
However, illustrating the scale of the challenge, the majority of respondents (68 percent) do not expect to achieve total 5G coverage until 2028 or later. Twenty-eight percent expect to have total coverage by 2027 while only 4 percent expect to have total coverage by 2025.
“While telcos recognise the opportunity 5G presents, they also understand the network transformation required to support 5G services,” said Martin Olsen, vice president of global edge and integrated solutions at Vertiv. “This report brings clarity to the challenges they face and reinforces the role innovative, energy-efficient network infrastructure will play in enabling 5G to realise its potential.”
To support 5G services, telcos are ramping up the deployment of multi-access edge computing (MEC) sites, which bring the capabilities of the cloud directly to the radio access network. Thirty-seven percent of respondents said they are already deploying MEC infrastructure ahead of 5G deployments while an additional 47 percent intend to deploy MECs.
As these new computing locations supporting 5G come online, the ability to remotely monitor and manage increasingly dense networks becomes more critical to maintaining profitability. In the area of remote management, data centre infrastructure management (DCIM) was identified as the most important enabler (55 percent), followed by energy management (49 percent). Remote management will be critical, as the report suggests the network densification required for 5G could require operators to double the number of radio access locations around the globe in the next 10-15 years.
The survey also asked respondents to identify their plans for dealing with energy issues today and five years from now, when large portions of the network will be supporting 5G – a shift that 94 percent of participants expect to increase network energy consumption.
“5G represents the most impactful and difficult network upgrade ever faced by the telecom industry,” said Brian Partridge, research vice president for 451 Research. “In general, the industry recognises the scale of this challenge and the need for enabling technologies and services to help it maintain profitability by more efficiently managing increasingly distributed networks and mitigating the impact of higher energy costs.”
Growth continues to top the list of CEO business priorities in 2019 and 2020, according to a recent survey of CEOs and senior executives by Gartner, Inc. The most notable change in comparison to last year’s results is that a growing number of CEOs also deem financial priorities important, especially profitability improvement.
The annual Gartner 2019 CEO and Senior Business Executive Survey, conducted in the fourth quarter of 2018, examined the business issues of 473 CEOs and senior business executives, as well as some areas of technology agenda impact. All qualified respondents were business leaders of companies with $50 million or more in annual revenue, and 60% led companies with $1 billion or more.
“After a significant fall last year, mentions of growth increased this year to 53%, up from 40% in 2018,” said Mark Raskino, vice president and distinguished analyst at Gartner. “This suggests that CEOs have switched their focus back to tactical performance as clouds gather on the horizon.”
At the same time, mentions of financial priorities, cost and risk management also increased (see Figure 1). “However, we did not see CEOs intending to significantly cut costs in various business areas. They are aware of the rising economic challenges and proceeding with more caution — they are not preparing for recession,” said Mr. Raskino.
Figure 1: Top 11 Business Priorities of CEOs
Source: Gartner (May 2019)
New Opportunities for Growth
The survey results showed that a popular response when growth is challenged is to look to other geographic locations. Respondents mentioned other cities, states, countries and regions; mentions of “new markets” would also typically involve some geographic reach (though a new market can also be industry-related, or virtual).
“It is natural to use location hunting for growth when traditional and home markets are saturated or fading,” said Mr. Raskino. “However, this year the international part of such reach is complicated and compounded by a shift in the geopolitical landscape. Twenty-three percent of CEOs see significant impacts to their own businesses arising from recent developments in tariffs, quotas and other forms of trade controls. Another 58% of CEOs have general concerns about this issue, suggesting that more CEOs anticipate it might impact their businesses in future.”
Another way that CEOs seem to be confronting softening growth prospects and weakening margins is to seek diversification — which increasingly means the application of digital business to offer new products and revenue-producing channels. Eighty-two percent of respondents agreed that they had a management initiative or transformation program underway to make their companies more digital — up from 62% in 2018.
High Hopes for Technology
Cost management has risen in CEO priorities, from No. 10 in 2018 to No. 8 today. When asked about their cost-control methods, 27% of respondents cited technology enablement, securing the third spot after measures around people and organization, such as bonuses and expense and budget management. However, when asked to consider productivity and efficiency actions, CEOs were much more inclined to think of technology as a tool. Forty-seven percent of respondents mentioned technology as one of their top two ways to improve productivity.
Digital Skills for All Executives
Digital business is something the whole executive committee must be engaged in. However, the survey results showed that CEOs are concerned that some of the executive roles do not possess strong or even sufficient digital skills to face the future. On average, CEOs think that sales, risk, supply chain and HR officers are most in need of more digital savvy.
Once all executive leaders are more comfortable with the digital sphere, new capabilities to execute on their business strategies will need to be developed. When asked which organizational competencies their company needs to develop the most, 18% of CEOs named talent management, closely followed by technology enablement and digitalization (17%) and data centricity or data management (15%).
“Datacentric decision-making is a key culture and capability change in a management system that hopes to thrive in the digital age. Executive leaders must be a role model to encourage and foster data centricity and data literacy in their business units and the organization as a whole,” Mr. Raskino said.
Supply chain to suffer blockchain ‘fatigue’
Blockchain remains a popular topic, but supply chain leaders are failing to find suitable use cases. By 2023, 90% of blockchain-based supply chain initiatives will suffer ‘blockchain fatigue’ due to a lack of strong use cases, according to Gartner, Inc.
A Gartner supply chain technology survey of user wants and needs found that only 19% of respondents ranked blockchain as a very important technology for their business, and only 9% have invested in it. This is mainly because supply chain blockchain projects have remained very limited in scope and have not matched the initial enthusiasm for the technology’s application in supply chain management.
“Supply chain blockchain projects have mostly focused on verifying authenticity, improving traceability and visibility, and improving transactional trust,” said Alex Pradhan, senior principal research analyst at Gartner. “However, most have remained pilot projects due to a combination of technology immaturity, lack of standards, overly ambitious scope and a misunderstanding of how blockchain could, or should, actually help the supply chain. Inevitably, this is causing the market to experience blockchain fatigue.”
The budding nature of blockchain makes it almost impossible for organizations to identify and target specific high-value use cases. Instead, companies are forced to run multiple development pilots using trial and error to find ones that might provide value. Additionally, the vendor ecosystem has not fully formed and is struggling to establish market dominance. Another challenge is that supply chain organizations cannot buy an off-the-shelf, complete, packaged blockchain solution.
“Without a vibrant market for commercial blockchain applications, the majority of companies do not know how to evaluate, assess and benchmark solutions, especially as the market landscape rapidly evolves,” said Ms. Pradhan. “Furthermore, current creations offered by solution providers are complicated hybrids of conventional blockchain technologies. This adds more complexity and confusion, making it that much harder for companies to identify appropriate supply chain use cases.”
As blockchain continues to develop in supply chains, Gartner recommends that organizations remain cautious about early adoption and not to rush into making blockchain work in their supply chain until there is a clear distinction between hype and the core capability of blockchain. “The emphasis should be on proof of concept, experimentation and limited-scope initiatives that deliver lessons, rather than high-cost, high-risk, strategic business value,” said Ms. Pradhan.
Global grocers will use blockchain
Meanwhile, Gartner, Inc. predicts that, by 2025, 20% of the top 10 global grocers by revenue will be using blockchain for food safety and traceability to create visibility to production, quality and freshness.
Annual grocery sales are on the rise in all regions worldwide, with an emphasis on fast, fresh prepared foods. At the same time, customers increasingly want to understand the source of their food, the provider’s sustainability initiatives, and overall freshness. Grocery retailers that provide this visibility and can certify their products against recognised standards will win the trust and loyalty of consumers.
“Blockchain can help deliver confidence to grocers’ customers, and build and retain trust and loyalty,” said Joanne Joliet, senior research director at Gartner. “Grocery retailers are trialing and looking to adopt blockchain technology to provide transparency for their products. Additionally, understanding and pinpointing the product source quickly may be used internally, for example to identify products included in a recall.”
Blockchain appears to be an ideal technology for fostering transparency and visibility along the food supply chain. Cryptographically secured records of food source, quality, transit temperature and freshness can help ensure that the data has not been tampered with, giving confidence to both consumers and retailers.
Some grocers have already been experimenting with blockchain and are developing best practices. For example, Walmart is now requiring suppliers of leafy greens to implement a farm-to-store tracking system based on blockchain. Other food companies, such as Unilever and Nestlé, are also using blockchain to trace food contamination.
“As grocers are being held to higher standards of visibility and traceability they will lead the way with the development of blockchain, but we expect it will extend to all areas of retail,” Ms. Joliet said. “Similar to how the financial services industry has used blockchain, grocers will evolve best practices as they apply blockchain capabilities to their ecosystem. Grocers also have the opportunity to be part of the advancement of blockchain as they develop new use cases for important causes for health, safety and sustainability.”
Digital failure?
By 2021, only one-quarter of midsize and large organizations will successfully target new ways of working in 80% of their initiatives, according to Gartner, Inc. New ways of working include distributed decision making, virtual and remote work, and redesigned physical workspaces.
“Digital workplace initiatives cannot be treated exclusively as an IT initiative,” said Carol Rozwell, distinguished research vice president at Gartner. “When initiatives are executed as a series of technology rollouts, employee engagement and the associated cultural change are left behind – and digital workplace success is impossible without them.”
Emerging Change Leadership
A new approach for coping with the shifting demands of digital business is emerging for digital workplace leaders — change leadership.
“Digital workplace leaders must realize that their role as orchestrators of change means fundamentally moving away from previously ingrained leadership practices that view employees as a group resistant to change, and towards involving them in co-creating the path forward,” said Ms. Rozwell.
Digital Workplace A-Team
As digital workplace leaders shift their thoughts and actions toward people-oriented designs, they can inspire and engage a cross-disciplinary “A Team” to help strategize new ways of working. This “A Team” — drawn from IT, facilities management, human resources and business stakeholders — envisions how new technologies, processes and work styles will enhance the overall employee experience and enable employees to perform mission-critical work more effectively. In the end, organizations that take time to invest in the employee experience will net a 10-percentage-point improvement in employee engagement scores.
Role of the Business Unit
Successful digital workplace programs are less about technology and more about understanding what affects the employee experience and making necessary changes to the work environment.
“The business unit leader is the champion of a new way of working in the digital workplace. This is the person who identifies the desired business outcomes, develops the business case and establishes the measures by which success is determined. Without engaging business unit leaders, it will be impossible to successfully grapple with the scope of changes needed,” said Ms. Rozwell.
In Europe we are witnessing a shift in technology spending from IT to the line of business (LOB). In a new update to the Worldwide Semiannual IT Spending Guide: Line of Business, International Data Corporation (IDC) forecasts that European technology spending by LOB decision makers will steadily grow and will increase faster than spending funded by IT organizations through 2022.
European companies are forecast to spend $399 billion on IT (hardware, software, and services) in 2019. More than half of that spending (58.9%) will come from the IT budget, while the remainder (41.1%) will come from the budgets of technology buyers outside of IT. Nonetheless, LOB technology spending will grow at a faster rate than IT spending in the years ahead: the compound annual growth rate (CAGR) for LOB spending over the 2017–2022 period is forecast at 5.9%, compared with 2.9% for IT-funded spending.
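To make that shift concrete, here is a rough, illustrative Python sketch (our own, not IDC’s model) that applies the quoted growth rates to the 2019 split; it assumes, purely for illustration, that the stated CAGRs hold unchanged from the 2019 base through 2022.

```python
# Illustrative projection (not IDC's model): apply the quoted growth rates to
# the 2019 European spending split and watch the LOB share creep upwards.

total_2019_bn = 399.0             # total European IT spend in 2019, $ billion
lob_share_2019 = 0.411            # 41.1% funded by line-of-business budgets
it_cagr, lob_cagr = 0.029, 0.059  # quoted CAGRs for IT- and LOB-funded spend

lob = total_2019_bn * lob_share_2019
it = total_2019_bn * (1 - lob_share_2019)

for year in range(2019, 2023):
    print(f"{year}: LOB ${lob:.0f}B, IT ${it:.0f}B, LOB share {lob / (lob + it):.1%}")
    lob *= 1 + lob_cagr
    it *= 1 + it_cagr
```

Under those assumptions the LOB share rises from 41.1% in 2019 to roughly 43% by 2022, which is the direction of travel IDC describes.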
In 2019, banking, discrete manufacturing, and process manufacturing will have the largest spend coming from LOBs. Retail, professional services, and discrete manufacturing will see the fastest LOB spending growth in 2019 compared with 2018.
"In Europe, business managers are raising their voice in the IT decision-making process," said Andrea Minonne, senior research analyst at IDC Customer Insights & Analysis. "This trend is revolutionizing and disrupting how companies make technology investments, with LOBs more often claiming control over IT budgets. The consumerization of applications, especially those to access content and improve collaboration, together with the more mainstream use of cloud solutions, the uptake of software as a service, and the BYOD area, are driving LOBs to make technology investments independently. This results in a greater tendency to skip IT department approval, which can sometimes take a long time and delay workloads."
Consumer technology spending to reach $1.3 trillion
Consumer spending on technology is forecast to reach $1.32 trillion in 2019, an increase of 3.5% over 2018. According to the inaugural Worldwide Semiannual Connected Consumer Spending Guide from International Data Corporation (IDC), consumer purchases of traditional and emerging technologies will remain strong over the 2018-2022 forecast period, reaching $1.43 trillion in 2022 with a five-year compound annual growth rate (CAGR) of 3.0%.
"The new Connected Consumer Spending Guide leverages IDC's long history of capturing consumer device shipments, combined with valuable insights from regular consumer surveys and channel discussions, to tell a comprehensive story about consumer spending," said Tom Mainelli, IDC's group vice president for Devices and Consumer Research. "The Connected Consumer Spending Guide team has built out an initial set of consumer-focused use cases designed to deliver insights about spending across a wide range of device types, from smartphones to tablets, PCs to drones, and smart speakers to wearables. Over time, the team will continue to develop an ever-widening array of use cases, adding additional data about software and services, and eventually demographic-focused insights."
Traditional technologies – personal computing devices, mobile phones, and mobile telecom services – will account for more than 96% of all consumer spending in 2019. Mobile telecom services will represent more than half of this amount throughout the forecast, followed by mobile phones. Spending growth for traditional technologies will be relatively slow with a CAGR of 2.4% over the forecast period.
In contrast, emerging technologies, including AR/VR headsets, drones, robotic systems, smart home devices, and wearables, will deliver strong growth with a five-year CAGR of 20.6%. By 2022, IDC expects more than 5% of all consumer spending will be for these emerging technologies. Smart home devices and smart wearables will account for nearly 70% of the overall spending on emerging technologies in 2019. Smart home devices will also be the fastest growing technology category with a five-year CAGR of 38.0%.
"Connected technologies are transforming consumers' activities and habits, becoming more and more integrated into their daily lives. This is fueling the consumer's unquenchable thirst for content and immersive experiences delivered anytime, anywhere, via multiple formats and across a myriad of channels. As a result, we see the balance of power shifting in consumer-facing industries. Whereas once upon time, the enterprise called the shots, more and more consumer demands and expectations are propelling innovation," said Jessica Goepfert, program vice president, Customer Insights & Analysis at IDC. "What's the next wave of consumer transformation? Even more widely adopted and mature activities such as listening to music and shopping are being disrupted by new technologies such as smart speakers. And disruption presents opportunity."
Communication will be the largest category of use cases for consumer technology, representing nearly half of all spending in 2019 and throughout the forecast. Most of this will go toward traditional voice and messaging services, joined by social networking and video chat as notable use cases within this category. Entertainment will be the second largest category, accounting for nearly a quarter of all spending as consumers listen to music, edit and share photos and videos, download and play online games, and watch TV, videos, and movies. The use cases that will see the fastest spending growth over the forecast period are augmented reality games (82.9% CAGR) and home automation (59.8% CAGR).
"There's an expectation among today's consumers for a seamless consumer experience. The connected consumer is no longer a passive one; the connected business buyer is in control and it's essential for technology providers to understand this if they want to continue to grow and gain market share in this digital age. As technology becomes more affordable and accessible, the connected consumer is expected to spend more as they leverage these platforms for entertainment, education, social networking, commerce, and other purposes. IDC's Worldwide Semiannual Connected Consumer Spending Guide presents a comprehensive view of the consumer ecosystem and serves as a framework for how IDC organizes its consumer research and forecasts," said Stacey Soohoo, research manager with IDC's Customer Insights & Analysis group.
Connected vehicle shipments to reach 76 million by 2023
In its inaugural connected vehicle forecast, International Data Corporation (IDC) estimates that worldwide shipments of connected vehicles, which includes options for embedded and aftermarket cellular connectivity, will reach 51.1 million units in 2019, an increase of 45.4% over 2018. By 2023, IDC expects worldwide shipments to reach 76.3 million units with a five-year compound annual growth rate of 16.8%.
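As a quick sanity check on the arithmetic, the short Python sketch below backs the implied 2018 base out of the 2019 figure and recomputes the five-year CAGR. This is our own illustration of the compound-growth formula, not IDC’s methodology, and it assumes the CAGR is measured from the 2018 base year to 2023.

```python
# Quick consistency check (our own sketch, not IDC methodology): back the 2018
# base out of the 2019 shipment figure, then recompute the five-year CAGR.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values, in percent."""
    return ((end / start) ** (1 / years) - 1) * 100

shipments_2019_m = 51.1   # million units forecast for 2019
growth_2019 = 0.454       # 45.4% year-on-year growth quoted for 2019
shipments_2023_m = 76.3   # million units expected by 2023

shipments_2018_m = shipments_2019_m / (1 + growth_2019)  # implied 2018 base

print(f"Implied 2018 shipments: {shipments_2018_m:.1f}M units")
print(f"2018-2023 CAGR: {cagr(shipments_2018_m, shipments_2023_m, 5):.1f}%")
# Prints roughly 35.1M units and 16.8%, matching the stated five-year CAGR.
```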
IDC defines a connected vehicle as a light-duty vehicle or truck that contains a dedicated cellular network wireless wide area connection that interfaces with the vehicle data (e.g., gateways, software, or sensors). Newer, recent model year vehicles are shipped with an embedded, factory-installed connected vehicle system. Older vehicles typically connect via an aftermarket device, which is a self-contained hardware and software unit that is installed into a vehicle's OBDII port.
The commitment of mass market automotive brands to make embedded cellular standard equipment in key markets was a major development in the arrival of connected vehicles. By 2023, IDC predicts that nearly 70% of worldwide new light-duty vehicles and trucks will be shipped with embedded connectivity. Likewise, IDC expects that nearly 90% of new vehicles in the United States will be shipped with embedded connectivity by 2023.
The sustained growth of the connected vehicles market is being driven by a multitude of factors, including consumer demand for a more immersive vehicle experience, the ability of auto manufacturers to better utilize connected vehicles for cost avoidance and revenue generation, evolving government regulations, and mobile network operator investments in new connections and services.
"The automotive ecosystem is positioning the vehicle as the next, emerging digital platform," said Matt Arcaro, research manager, Next-Generation Automotive Research at IDC. "Deploying embedded or aftermarket connectivity at scale will be key to unlocking its potential."
Telecoms market prepares for 5G impact
Worldwide spending on telecom services and pay TV services totaled $1.6 trillion in 2018, reflecting an increase of 0.8% year on year, according to the International Data Corporation (IDC) Worldwide Telecom Services Database. IDC expects worldwide spending on telecom and pay TV services to reach nearly $1.7 trillion in 2023.
Mobile services will continue to dominate the industry in terms of spending, with mobile data still expanding, driven by booming smartphone markets. At the same time, mobile voice is slowly declining due to fierce competition and market maturity. The mobile segment, which represented 53.1% of the total market in 2018, is expected to post a compound annual growth rate (CAGR) of 1.4% over the 2019-2023 period, driven by growth in mobile data usage and the Internet of Things (IoT), which are offsetting declines in spending on mobile voice and messaging services.
Fixed data, especially broadband internet access, is still expanding in most geographies, supported by the increasing importance of content services for consumers and IP-based services for businesses. Fixed data service spending represented 20.5% of the total market in 2018 with an expected CAGR of 2.6% through 2023, driven by the need for higher-bandwidth services. Spending on fixed voice services will record a negative CAGR of 5.3% over the forecast period and will represent only 8.5% of the total market through 2023. Rapidly declining TDM voice revenues are not being offset by the increase in IP voice.
The pay TV market, which consists of cable, satellite, IP, and digital terrestrial TV services, will remain flat over the forecast period; however, these services are an increasingly important part of the multi-play offerings of telecom providers across the world. Spending on multi-play services increased by 7.1% in 2018 and is expected to post a CAGR of 3.7% through 2023.
On a geographic basis, the Americas was the largest services market, with revenues of $616 billion in 2018, driven by the large North American sector. Asia/Pacific was the second largest region, followed by Europe, the Middle East, and Africa (EMEA). The market with the fastest year-on-year growth in 2018 was EMEA (driven mainly by its emerging markets), followed by Asia/Pacific.
Global Regional Services 2018 Revenue and Growth

Global Region | 2018 Revenue ($B) | CAGR 2018-2023 (%)
Americas | 616 | 0.0
Asia/Pacific | 512 | 0.8
EMEA | 487 | 0.9
Grand Total | 1,615 | 0.5

Source: IDC Worldwide Semiannual Services Tracker 2H 2018
The calm stability that currently marks the telecom services market will not last for long. The advent of 5G is the focus of massive media attention as it represents new architectures, speeds, and services that will remake the mobile landscape.
"5G will unlock new and existing opportunities for most operators as early use cases such as enhanced mobile broadband and fixed wireless access will be gaining traction rapidly in most geographies, while massive machine-type communications and ultra-reliable low-latency communications will debut in more developed countries," says Kresimir Alic, research director with IDC's Worldwide Telecom Services. "Additionally, the worldwide transition to all-IP and new-generation access (NGA) broadband will help offset the fixed and mobile voice decline. We are witnessing a global digital transformation revolution and carrier service providers (CSPs) will play a crucial role in it by innovating and educating and by supporting the massive roll-out of software and services."
The DCS Awards, organised by the publisher of Digitalisation World, Angel Business Communications, go from strength to strength. This year’s event, sponsored by Kohler Uninterruptible Power, was MC’d by Paul Trowbridge, the MD of Angel’s sister event management business, EVITO. Following on from the excellent dinner, there was a short industry-focused speech from the Data Centre Alliance’s Steve Hone, and then the first half of the night’s entertainment – all kindly sponsored by Starline – with Zoe Lyons’s brilliant observational comedy. The DCS Awards were then handed out before an after-hours casino, with music from Ruby and the Rhythms. Here we focus on the winners.
Data Centre Energy Efficiency Project of the Year
Sponsored by: Kohler Uninterruptible Power
WINNER: Aqua with 4D (Gatwick Facility)
Collecting: Mike West, Data Centre Projects Manager
New Design/Build Data Centre Project of the Year
Sponsored by: Starline
WINNER: POWER CONTROL with CoolDC
Collecting: Sam Rea, Business Development Mgr, Power Control & Angela Meah, Cool DC
CLOUD Project of the Year
WINNER: SureCloud with the Equiom Group
Collecting: Luke Potter, Operations Director (Cybersecurity)
Data Centre Consolidation / Upgrade / Refresh Project of the Year
Sponsored by: Power Control
WINNER: SUDLOWS with the Science & Technology Facilities Council
Collecting: Andy Hirst – MD of Critical Infrastructures
MANAGED SERVICES Project of the Year
WINNER: ALTARO with Chorus
Collecting: Keith Joseph – Channel Manager Northern Europe
GDPR COMPLIANCE Project of the Year
Sponsored by: Kohler Uninterruptible Power
WINNER: SURECLOUD with Everton FC
Collecting: Brian Nevitt, Senior Account Executive
Data Centre POWER Innovation of the Year
WINNER: HUAWEI FusionPower Solutions
Collecting: James Coughlan & Hans Heckman
PDU Innovation of the Year Award
WINNER: ServerTech’s HDOT Cx PDU
Collecting: James Giblette, BU Director – UK & Ireland
Data Centre COOLING INNOVATION of the Year
WINNER: Transtherm and 2bm – Budget-friendly, Compressor-less Cooling Solution
Collecting: Clayton D’Souza - OEM Accounts Manager – Transtherm Cooling Industries
Data Centre INTELLIGENT AUTOMATION & MANAGEMENT INNOVATION of the Year
Sponsored by: The Data Centre Alliance
WINNER: Nlyte Software’s Dedicated Machine Learning Solution
Collecting: James Stuart, Regional Sales Director
Data Centre Physical Connectivity Innovation of the Year
Sponsored by: Legrand
WINNER: Corning Optical Communications
Collecting: Cindy Ryborz, Marketing Manager, Data Center EMEA
Data Centre ICT Storage INNOVATION of the Year
Sponsored by: Schneider Electric
WINNER: DATACORE
Collecting: Brett Denly, Regional Director
Data Centre ICT Security Innovation of the Year
Sponsored by: SureCloud
WINNER: CHATSWORTH PRODUCTS
Collecting: Peter Davies of Angel, on Chatsworth’s behalf
Data Centre ICT Management Innovation of the Year
WINNER: Schneider Electric
Collecting: Marc Garner – VP, UK & Ireland
Data Centre ICT NETWORKING Innovation of the Year
Sponsored by: TIMICO
WINNER: BRIDGEWORKS
Collecting: CEO David Trossell
Data Centre ICT AUTOMATION Innovation of the Year
WINNER: MORPHEUS DATA
Collecting: Sam Rea, Business Development Mgr, Power Control & Angela Meah, Cool DC
OPEN SOURCE Innovation of the Year
Sponsored by: DCA
WINNER: OVH
Collecting: Hiren Parekh, Senior Director of Cloud Services
The Data Centre Managed Services Innovation of the Year
Sponsored by long-standing supporters: Data Centre World
WINNER: SCALE COMPUTING WITH CORBEL
Collecting: Sam Rea, Business Development Mgr, Power Control & Angela Meah, Cool DC
Data Centre Hosting/Co-location Supplier of the Year
Sponsored by: NLYTE SOFTWARE
WINNER: GREEN MOUNTAIN
Collecting: Tor Kristian Gyland, CEO
Datacentre Cloud Vendor of the Year
WINNER: ARCSERVE
Datacentre FACILITIES Vendor of the Year
Sponsored by: The DCA
WINNER: CBRE
Collecting: Kevin Kearns, Business Unit Director, FM Services
The EXCELLENCE IN DATA CENTRE SERVICES AWARD
Sponsored by: RIELLO UPS
WINNER: CURVATURE
Collecting: Christo Conidaris, VP of Sales EMEA
Datacentre MANAGER of the Year
Sponsored by: NAVISITE
WINNER: Simon Binley - Wellcome Sanger Institute
Collecting: Simon Binley
Datacentre ENGINEER of the Year
Sponsored by: CBRE
WINNER: SAM WICKS of SUDLOWS
Collecting: Sam Wicks
DCS Industry Award
Each year Angel likes to recognise a significant contribution to or outstanding achievement in the industry – this award is not voted for by the readers of the publications but based on the publisher’s and editorial staff’s opinions.
WINNER: NAVISITE
Collecting: Aaron Aldrich, Client Success Director
Navisite’s recent recognition as an Azure Expert Managed Service Provider is the latest demonstration of their commitment to excellence in deploying and managing on-premise and colo-cloud environments, with over 10,000 VMs under management.
Navisite helps clients use the cloud strategically, pairing employee expertise across leading enterprise applications and more than 1,400 IT certifications with an international footprint of state-of-the-art data centers to deliver solutions tailored to each client’s needs.
Curvature recently won the Excellence in Data Centre Services DCS Award, organized by the publisher of Digitalisation World, Angel Business Communications. Here we talk to Christo Conidaris, VP Sales EMEA, Curvature, about the benefits of third party maintenance.
1. Please can you provide some brief background on the company:
For the past 30 years, Curvature has been transforming and challenging the traditional management of data centres worldwide. We have achieved this through innovative maintenance solutions, world-class end-to-end professional services, and extremely competitive upgrade costs for multi-vendor equipment and support. For data centre management teams, Curvature makes an impact by extending the lifecycles of IT assets, decreasing costs and freeing staff from daily support overheads. With over 1,500 employees worldwide, Curvature is the established independent global leader in IT maintenance services and support, first-class professional services and innovative products and solutions.
2. And what are Curvature’s USPs for the data centre market?
When you are leading the way in third party maintenance, the outstanding quality of your people is your key asset and differentiator. Curvature offers specifically trained field engineering teams to make sure that the customer receives the best possible service. Our badged engineers come with decades’ worth of experience and are backed by a strong Global Central Engineering team that provides development, training and L3 support. Curvature also boasts Centres of Excellence to facilitate testing and hands-on training in live environments, such as its recent multi-million-dollar investment in a mainframe and complex servers Innovation Lab.
Curvature also has over 100 service centres around the globe, each offering a wholly-owned spare parts inventory (non-shared logistics), and frequently beats the delivery times and availability that OEMs can offer. Curvature customers are kept centre stage, with instant delivery updates via a portal that tracks service activity and changes.
3. Can you give us an overview of the company’s data centre products and services portfolio?
Curvature’s Third Party Maintenance services extend across the storage, servers/complex servers, mainframe and networking pillars. For Professional Services, Curvature offers SLA-based Remote Hands (Global Remote Hands, IMACD); Assessments (Site Assessment Audits, Network Assessments, Server and Storage Assessments); Transition (Data Centre Relocation, Data Centre Migration, Data Replication); ITAD (IT Asset Disposition, EUC ITAD); Consulting (Network Assessment, Server and Storage Assessment, Design/Optimise/Configure, Advanced Technical Consulting); Implementation (Rack & Stack, Installation, Deployment); and Cloud Advisory Services (Strategy Cloud, Transition Readiness, Placement, Strategy X-Ray).
4. And what are some of the most recent additions to this?
Curvature invests significantly in ongoing R&D efforts to identify the most relevant solutions that customers require, continually enhancing our portfolio to address market and customer needs. As examples, from a maintenance perspective, Curvature has recently added maintenance for Juniper Networks equipment to its portfolio, and has also added support for the IBM Netezza high-performance storage and data warehouse platform. Under the umbrella of professional services, the portfolio now extends to Health Checks and Performance Checks on storage systems, and CloudLogic advisory services for cloud optimisation.
5. And what advice would you give to individuals looking at using third party hardware maintenance?
Before moving to a third party maintenance contract, it is wise to do some groundwork to check that your expectations will be met long-term. We meet countless customer prospects who have been made promises that remain unfulfilled. To help from the outset, Curvature has developed a checklist of five essential checks that you should always research before drawing up the shortlist of TPMs that you think would best fit your organisation.
First (and fairly obviously), compare how long the TPM has been in business. It goes without saying that the longer a TPM has been successfully trading, the more established and reliable its services will be, and the more reference customers it will offer. Long-term TPMs develop into equally long-term strategic partners for years to come.
Second, check that the geographical coverage offered mirrors your own IT infrastructure. It is no good having 12 offices in the UK and none in the rest of the world when you trade across continents 24x7. Check carefully where Forward Stocking Locations (FSLs) and sparing depots are placed, city by city, so that you can meet and exceed your own internal SLAs with a provider that offers 24x7x365 global support.
Thirdly, examine in detail the prospective TPM’s 100% sparing ‘philosophy’. Ensure that all hardware parts under contract are tested and spared at a location close enough to meet your SLAs. Delve deep into the answers on this point; it is critical to understand and compare the sparing process in detail. Question who owns the parts and how they have been tested. Also, ensure that the provider will grow with you, even if that means adding FSLs around the globe to continue to meet your needs.
Fourth (and again this should be a tick-box exercise), check that the TPM’s certifications are industry recognised and carry the highest levels of quality and security standards internationally, including TL9000 (for telecoms), ISO 27001:2013 for Information Security Management Systems, and responsible recycling.
Lastly, check that all the equipment in your IT infrastructure can be managed in one portal. This way, multi-generation, multi-vendor and multi-national support contracts are all accessible to you through a centralised portal with a ticket management system.
6. Can you tell us what winning the Excellence in Data Centre Services DCS Award means to the company?
This award means a great deal to Curvature for two reasons. Firstly, it is project-based, so it details in depth the experiences of three of our European customers and the value they have derived from working with Curvature over sustained timescales. Secondly, the entry was independently highlighted by the judging team, then verified, and voted on en masse by the thousands of readers of DW and our own customers. The scale of readership voting gives an independent endorsement and verification of the ongoing value we quietly and consistently deliver to the market. It’s strong testimonials like these three, plus taking the voting lead in a very strong, very full category, that validate the TPM alternative proposition that OEMs have long tried to suppress!
7. And can you talk us through the winning entry – what are the key distinguishing features and/or USP?
This winning entry was all about detailing the end-to-end Curvature service experience across three installs and highlighting the positive user impacts derived from the switch to our TPM provision. In fact, over 15,000 other data centres worldwide are today also embracing Curvature service levels. Over the past ten years, Curvature has steadily grown the third party service market segment to such an extent that it is now established as the world’s largest independent provider of multi-vendor maintenance and 24/7 support services. This growth is remarkable given that it has been driven largely by customers’ recognition, peer-to-peer validation and recommendation, often on a global scale.
8. Why is this important to data centres?
The entry exemplifies how, using Curvature, data centre managers now have a realistic alternative to vendor support for brands such as Cisco, HP and IBM – one that actively cuts capital costs, extends the lifecycles of data centre assets and significantly reduces e-waste. We have shown that this represents a healthy market alternative that will stretch IT budgets to afford and encompass innovation. Creating this supporting infrastructure was not achieved overnight: Curvature has made massive investments in hiring and retaining talented personnel working from a backdrop of global locations.
For data centre services, the entry highlights how Curvature uniquely offers assessment, consolidation, relocation and cloud migration services, among others. Curvature’s focus on supporting network, server and storage assets that can scale and grow appeals to mainstream IT environments. For more specialised environments such as mainframes, Curvature supports more mainframe environments than any other provider (bar IBM) via 60 global mainframe service centres, offering parts and expertise that extend from 1993 mainframe models through to the latest generation (n-1). Curvature has also invested more than $5 million since 2008 building its own zSeries innovation lab, where the central engineering team operates and develops best-in-class mainframe support methodologies, proprietary “call-home” tools, hands-on testing of spare parts and instructor-led training for field engineers.
9. What tangible impact has your company had on the market and your customers?
One impact could be described as upsetting the OEM apple cart: we offer data centre managers a realistic, viable choice for service and support that saves money and increases service levels. With a growing global movement toward the consideration and adoption of third party maintenance, the greatest footprint Curvature has forged is challenging vendor-instilled behaviours, so that data centre managers now have an alternative path outside of the OEM. No longer do UK data centres face a stark rip-and-replace decision underpinned and dictated by the OEM every 3-5 years as hardware falls out of warranty.
Today, using multi-vendor umbrella service agreements, Curvature supports and extends network, server and storage systems through 24/7 maintenance coupled with the provision of pre-owned and new hardware. The impact is significant, as Danish retail company DKI Group elaborates. DKI Group asked Curvature to recommend a blend of new and pre-owned hardware to optimise its budget without making any compromises on performance or security. “We came up with a solution for what we needed to achieve,” recalls Christian Hingelberg Hangaard, CFO of DKI Group. “Curvature met all of our performance requirements and then suggested adding pre-owned servers, which reduced our capital expenditures significantly.”
10. What levels of customer service differentiate you from your competitors?
Each day, Curvature goes beyond required service levels to thrill customers and exceed expectations, often on a global scale. Every employee embraces the idea that providing outstanding service is the essence of what Curvature does, supporting over 1.25 million devices and leading the sector in the process. Dave Kelly, Connectivity Service Line Program Manager for Curvature customer Schneider Electric, makes the point: “Curvature’s customer service is absolutely the best I’ve ever experienced. Nothing is ever a problem and there is always someone on the other end of the phone to escalate any issue. Response times are great!”
Through Curvature’s joined-up approach of pre-owned hardware, maintenance and professional services, global data centres can be viewed holistically and without the constraints of geography, producing remarkable savings, often of 60% or more.
Dave details his global experience at Schneider Electric. “It was a challenge to integrate all the networking elements for existing and recently acquired companies on the same page. We needed help with a never-ending list of transformation projects to ease the process of consolidating different network vendors. What we really wanted was a one-stop shop to handle all our global needs, but we never really found one company that could do it all with the same level of support and service excellence. We needed a partner that could help us meet the relentless reality of operations issues, equipment deployments and new projects.”
Enter Curvature’s NetSure® third-party maintenance (TPM), which enabled Schneider Electric to extend the useful life of its highly reliable Cisco gear well beyond its five-year lease. Not only does Curvature offer hardware lifecycle extension through extended warranty support, it also offers bespoke services using proven methodologies for assessments and optimisation, migration, project management, design, relocation and remote staff augmentation. While following these formal methodologies, Curvature remains approachable, responsive and attuned to business needs at all times.
Carl Christensen (CAC), a leading supplier of spare parts to the automotive, industrial and marine markets in Denmark, adopted Curvature’s IT Infrastructure Design and Implementation Services to improve services and benefits for customers. “We were confident in the approach taken by the Curvature engineers as they explained the process, answered all our questions and followed through on each step. The team was completely hands-on, which made it easy to get help when we needed it. We immediately had the opportunity to demonstrate that we had done our due diligence and picked the right partner,” concludes Asger R. Poulsen, CIO at CAC. “I can strongly recommend Curvature for their skills, dedication and comprehensive infrastructure design, implementation and support services.” A final supporting quote goes back to Schneider Electric: “Curvature’s service levels went way above and beyond the service you can usually expect; the level of Curvature’s commitment never ceases to impress.”
11. And what can we expect from Curvature in the future?
Market acceptance of TPMs means the future looks increasingly exciting, as TPM is now a mainstream, validated alternative. Recent history shows that ever more data centre products and solutions have fallen under the TPM’s remit. Curvature’s Central Engineering and R&D teams are constantly validating, expanding and innovating to meet data centre expectations and exceed service levels. Professional service offerings continue to grow to facilitate on-demand, digital service delivery and empowerment, with Curvature’s cloud migration and public and private cloud optimisation services also increasing.
TPM in action
Schneider Electric Teams with Curvature to Power Worldwide Connectivity and Continuous Innovation
Schneider Electric is leading the digital transformation of energy management and automation in homes, buildings, data centres, infrastructure and industries. The company’s flagship product, EcoStruxure™, is an open, interoperable, IoT-enabled system architecture and platform designed to deliver enhanced value around safety, reliability, efficiency, sustainability, and connectivity.
EcoStruxure leverages advancements in IoT, mobility, sensing, cloud, analytics and cybersecurity to provide innovation at every level, encompassing connected products, edge control, as well as applications, analytics and services. With more than 140,000 employees working from 1,100 locations in over 100 countries, Schneider Electric is the undisputed global leader in power management.
According to Dave Kelly, Connectivity Service Line Program Manager for Schneider Electric, a top priority is ensuring that both employees and customers can connect to the services and information they need anywhere, anytime and on any device. “Our purpose is to provide a network experience that fulfils user requirements and expectations,” he explains. “We aspire to offer connectivity-as-a-service that is user-centric, talent nurturing, business-oriented and technologically innovative.”
The Challenge
To that end, an 80-person global operations team is responsible for deploying and maintaining a high-performance MPLS wide-area network featuring about 95% state-of-the-art Cisco switches and routers. Driven by an overarching commitment to innovation, diversity and sustainability, Schneider Electric is migrating users and applications to the cloud while exploring the benefits of a hybrid networking model using Software-Defined Networking (SDN) and white-box solutions.
“We like to stay ahead of the curve,” says Kelly. “Schneider Electric has a dedicated innovation team that keeps pace with the latest digital technologies to ensure both internal and external customers are always connected in support of our ‘Life is On’ mission.” In keeping with its mission, Schneider Electric moved away from regional IT support teams in 2014 to a global model with standardized methodologies and follow-the-sun support. “Our goal is to create a solid framework whereby our end-to-end services run smoothly while delivering all the capabilities to meet any customer requirement,” he adds.
Due to consistent, organic business growth and a steady stream of acquisitions and business expansions, the operations team needed to build a cohesive network foundation that connected new locations and increasing numbers of users worldwide. “It was a challenge to integrate all the networking elements for existing and recently acquired companies on the same page,” says Kelly. “We needed help with a never-ending list of transformation projects to ease the process of consolidating different network vendors.”
Historically, Schneider Electric relied on large services providers, like Dimension Data, IBM and Unisys, as well as Gold channel partners, to assist with deployments, upgrades and ongoing support worldwide. The challenge, however, was achieving a consistently high level of responsive support and expert services at all locations. “What we really wanted was a one-stop shop to handle all our global needs,” says Kelly. “But we never really found one company that could do it all with the same level of support and service excellence.”
In addition to meeting its standards for quality service and support worldwide, Schneider Electric needed to ensure cohesive, consistent connectivity for all locations and users. “We wanted greater control and more visibility across the network,” recalls Kelly. “Standardizing on solutions in keeping with global best practices was critical. We needed a global partner that could help us meet the relentless reality of operations issues, equipment deployments and new projects.”
The Solution
To keep pace with this fast growth, Schneider Electric’s expert procurement team oversees all hardware purchases, including the predominant presence of Cisco networking equipment. Typically, most purchases and hardware upgrades were predicated on five-year lease terms, but a well-timed call from a Curvature representative to a procurement specialist at a Schneider Electric facility in Barcelona offered insights into a more progressive and flexible path for maintaining the company’s global network.
An initial meeting revealed that Curvature’s NetSure® third-party maintenance (TPM) would enable Schneider Electric to extend the useful life of its highly reliable Cisco gear well beyond its five-year lease. Moreover, as the world’s largest independent IT maintenance company, Curvature clearly had the scope and depth of experience to help optimize Schneider Electric’s network investments while improving product lifecycle management. “We saw an immediate opportunity to lower maintenance costs with Curvature,” notes Kelly. “Over the course of the next year, the value of using Curvature as an alternative to legacy manufacturer support became obvious.”
Not only was Curvature’s pricing very competitive in comparison to manufacturer and channel pricing for hardware and services, but the independent service provider exceeded all others in delivery of expedited support. “Curvature’s customer service is absolutely the best I’ve ever experienced,” comments Kelly. “Nothing is ever a problem and there’s always someone on the other end of the phone to escalate any issue. Response times are great!”
Curvature’s worldwide presence was another major plus. “Curvature’s global reach is terrific,” Kelly adds. “If they didn’t already have onsite presence at one of our locations, they were working on it. It became clear we could leverage Curvature’s global presence to help address our global requirements.”
While Kelly was sold on Curvature’s extensive menu of services and proven capabilities, it took a bit longer for internal stakeholders to understand how alignment with one global solutions provider could produce far-reaching benefits. In fact, as Curvature took on ever-increasing maintenance activities, Kelly identified additional opportunities to reap substantial savings and support improvements by bringing other locations under Curvature’s TPM umbrella.
Curvature went on to earn company-wide acceptance when Schneider Electric issued an RFP for a worldwide maintenance plan following successful stints with Curvature managing support in North America and EMEA. “This was an eye-opener, as the global support RFP featured stiff competition from legacy providers,” recalls Kelly. “Curvature beat everyone while offering 60% savings in an apples-to-apples comparison. We even added more items to the scope and Curvature still produced overall savings of more than 45% over competing bids.”
The Benefits
As Schneider Electric’s one-stop shop for global network maintenance and support, Curvature is often the first and last line of defence when a piece of equipment needs upgrading or replacement. “One thing that really sets Curvature apart is the actual lead time for delivering equipment,” says Kelly. “Curvature delivers in 24 hours compared to the eight to 12 weeks it would take for the manufacturer or Gold channel partners to deliver. Curvature gives us huge agility.”
That agility is helping Schneider Electric extend NetSure into the company’s APAC locations over the next year. “Looking past the competitive pricing, it’s really about the value we get from our whole Curvature relationship that makes the biggest difference,” says Kelly. “Dealing with support issues—our experience with Curvature is very, very good service delivery—even if we have a problem with equipment that is not on Curvature maintenance.”
Case in point: Schneider Electric suffered a switch outage at a large manufacturing facility near Nice, France in the summer of 2016. The failure, which occurred on a Friday, threatened to impact operations when the manufacturer involved was unresponsive. Then Kelly called Curvature, and the response was immediate, with personnel and a replacement switch arriving the next day.
After working with Schneider Electric’s operations team to restore functionality over the weekend, Curvature returned the following weekend to add redundancy to the critical network element. “This went way above and beyond the service you can usually expect,” explains Kelly. “And it was very well noticed by upper management as the level of Curvature’s commitment never ceases to impress.”
A more recent switch failure in the Netherlands was quickly remedied by the Curvature team, even though the equipment wasn’t covered under NetSure initially. “Curvature shipped a new switch and restored service in less than five hours,” states Kelly. “We couldn’t have done that with any other partner or the manufacturer—only Curvature could respond that fast.”
In addition to expedited support responses, Schneider Electric has started to leverage Curvature’s alternative procurement strategies for additional cost savings on network equipment. The purchases typically entail refreshing incumbent gear that is not compliant globally or upgrades as needed. The ability to purchase pre-owned equipment from Curvature also has enabled Schneider Electric to quickly, easily and affordably meet miscellaneous requests from business units or new locations. Curvature’s professional, highly trained services team handles all the requests, working with Schneider Electric’s procurement department.
The consistent level of Curvature’s advanced technical expertise provides hands-on network analysis, which has proven instrumental in determining service level agreements (SLAs) for every device on Schneider Electric’s global network. Moreover, Curvature offers the best options for hybrid support, including SMARTnet and NetSure.
Together, Curvature and Schneider Electric hold quarterly business reviews to further refine networking strategies while identifying potential issues and areas of improvement before they impact operations. Looking ahead, Schneider Electric plans to lean more heavily on Curvature’s professional services team to handle ongoing deployment projects, empowering Schneider Electric to focus on strategic network improvements and user experience enhancements.
With Curvature’s help, Schneider Electric will build out its own service desk capabilities to align closely with continued business growth and global expansion. “Technology doesn’t fix problems; people do,” concludes Kelly. “We have such a good, direct partnership with Curvature and will continue to look for ways to use their professional services going forward—they give us a distinct competitive edge.”
DW talks to Hiren Parekh - Senior Director of Cloud Services, EMEA, OVH, about the company’s success as a cloud provider, with a particular focus on the Managed Kubernetes service.
1. Please can you provide some background on OVH?
OVH is a global, hyper-scale cloud provider that offers industry-leading performance and value to businesses. It represents the leading European cloud alternative. Founded in 1999, the group manages and maintains 28 datacentres in 12 sites, across four continents, deploys its own global fibre-optic network, and manages its entire supply chain for web hosting. Running on its own infrastructures, OVH provides simple, powerful tools and solutions to bring technology to businesses, and revolutionise the way that more than 1.4 million customers across the globe work. Respecting an individual’s right to privacy and equal access to new technologies are central to the company’s values, which is why the OVH motto is “Innovation for Freedom”.
2. And can you give us an overview of OVH’s strengths in what’s quite a crowded market?
Our SMART cloud approach is what sets us apart. ‘S’ for Simple refers to our ability to provide quick, easy access to IT resources with simple pay-as-you-go billing. ‘M’ for Multi-local references our global reach, spanning four continents and 28 datacentres. ‘A’ for Accessible alludes to our impressive price/performance ratio, making cutting-edge technology accessible to all. ‘R’ for Reversible is very important, because we are committed to fully reversible, portable environments. Finally, ‘T’ stands for Transparent: transparency is particularly valuable to us. We take the GDPR and data compliance very seriously, and we maintain clear and transparent billing for all our products.
3. Please can you talk us through your products and services portfolio?
Our infrastructure-as-a-service (IaaS) portfolio is centred around our Public Cloud, Private Cloud and Dedicated Server ranges.
OVH Public Cloud offers an extensive range of cloud solutions, billed on a pay-as-you-go basis and set up in a simple way that supports your projects. You can take advantage of the flexibility of on-demand resources to scale up from small projects to large-scale deployments. An ideal solution for e-commerce sites or applications with unpredictable traffic, the OVH Public Cloud offers guaranteed CPU, RAM and bandwidth, with no hardware management required on your part. Through a single online interface, you can manage your entire infrastructure (server, storage, network etc.) with complete flexibility and autonomy. The comprehensive ecosystem of products and tools also includes the award-winning Managed Kubernetes® Service and our Data Analytics Platform, which is powered by Hadoop.
Private cloud is often defined as the type of cloud that companies build in-house, thus providing all the guarantees that come with a dedicated hardware infrastructure. The OVH Private Cloud combines the benefits of a dedicated infrastructure with all the flexibility and on-demand resources of the Public Cloud. It is a more sustainable, robust, secure, transparent cloud, and above all it is 100% dedicated on all infrastructure layers.
The strength of the OVH Private Cloud lies in its ability to provide you with a dedicated and managed infrastructure giving you full access to the vSphere hypervisor (as-a-service), which is exactly the same as what you would get when using VMware technologies on your infrastructure. This way, you can take advantage of all the power from your hardware and network capacities while OVH takes care of their operational maintenance. This is what we call "Dedicated IaaS."
We have a full range of Dedicated Servers customised to suit your business needs. With OVH, you are guaranteed our full range of expertise regarding bare-metal solutions. All our servers use next-generation components. Each machine is designed and assembled by us, then delivered in a record time of 120 seconds. We offer different ranges of highly-efficient dedicated servers, adapted to the most demanding needs of any type of company. Host your website, deploy your highly-resilient infrastructure, or customise your machine to suit your projects, in just a few clicks.
4. In particular, what are the most recent products and services you’ve brought to market?
We very recently released the OVH Managed Kubernetes Service; more on that below.
Our new bare-metal product ranges combine simplicity, high performance and versatility. We have divided our bare-metal portfolio into four product ranges to keep up with customer demand for added simplicity, while retaining our fundamental values such as availability, flexibility and latest-generation components.
Whether it’s to adopt the cloud, support the growth of your business or meet your critical business requirements, OVH’s four new bare-metal product ranges offer a wide spectrum of configurations that take advantage of the latest technological innovations. Customers therefore get optimised and customised solutions for their specific needs.
5. And what can we expect from OVH during the rest of 2019 and into 2020?
Come and visit us at the OVH Summit in Paris on 10th October 2019 to get the lowdown.
More information can be found here: https://summit.ovhcloud.com/en/
6. What advice would you give to end users as they evaluate the various cloud solutions available?
There is no one-size-fits-all approach. Be sure to evaluate your workloads and individual use cases to identify the best cloud platform, and run some benchmark tests to ensure that it meets your needs before migrating production workloads. Although it sounds like common knowledge, you would be surprised how many companies manage to overlook their disaster recovery needs, which are increasingly important for business continuity in our always-online world.
7. OVH won the Open Source Innovation of the Year at the DCS Awards – can you tell us what this means to the company?
Winning this award is recognition for our tech teams, who shed a lot of blood, sweat and tears from design to implementation to provide a Kubernetes service that is open source and compatible with any pure Kubernetes solution, without the hassle of installation or operation. A true demonstration of the innovation which is at the heart of OVH!
8. And can you talk us through the winning entry - the managed Kubernetes service the community had been waiting for, based on open-source standards and predictable pricing?
There is no vendor lock-in. Businesses are looking to cloud technology not only for infrastructure but as a platform. They require a Kubernetes service that is open source and compatible with any pure Kubernetes solution, without the hassle of installation or operation. Designed in a cloud-native way, OVH’s Managed Kubernetes Service does just that, and operates on an open-source basis.
9. OVH has announced the availability of a ready-to-use Managed Kubernetes Service offering on its Public Cloud, designed to make it easier for all its customers to use Kubernetes within their infrastructure?
Characterised by ease of deployment and by resilient, scalable applications, this service enables OVH's customers to focus on their core business. Kubernetes is currently used by tens of thousands of IT teams around the world to manage containerised workloads, from development to production. A recent survey by the Cloud Native Computing Foundation (CNCF) showed that Kubernetes is today the leading container management tool, used by 83% of organisations.
A managed Kubernetes service developed with the help of end-users
In order to enhance its cloud portfolio and provide its customers – whether DevOps organisations, systems administrators or IT decision makers – with a complete Kubernetes experience, without the hassle associated with software and infrastructure maintenance, OVH launched a Private Beta programme on 18th October 2018. Following enthusiastic feedback from the first testing customers, OVH has expanded its Managed Kubernetes Service offering.
Kubernetes' promise includes providing a common standard across hybrid-cloud and multi-cloud service providers. Staying close to that open-source spirit, and as one of the few CNCF-certified vendors in Europe, OVH has decided to offer a Kubernetes alternative to existing offerings, so as to ensure freedom of choice, reversibility and transparency for users.
OVH Managed Kubernetes Service therefore stays true to open standards while at the same time leveraging the benefits of OVH Public Cloud, including its excellent performance/price ratio.
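The practical upshot of that standards compliance is that ordinary Kubernetes tooling needs no adaptation. As a hedged illustration (this is generic, not OVH-specific, code, and it assumes the cluster's kubeconfig file has already been downloaded to the default location), the official Kubernetes Python client talks to a CNCF-conformant managed cluster exactly as it would to a self-hosted one:

```python
# Minimal sketch: a CNCF-conformant managed cluster exposes the standard
# Kubernetes API, so the stock Python client works unchanged. Assumes the
# provider's kubeconfig has been saved to the default ~/.kube/config location.
from kubernetes import client, config

config.load_kube_config()  # reads the kubeconfig exported from the provider

v1 = client.CoreV1Api()

# List the managed worker nodes and their kubelet versions.
for node in v1.list_node().items:
    print(node.metadata.name, node.status.node_info.kubelet_version)

# List every pod the cluster is running, across all namespaces.
for pod in v1.list_pod_for_all_namespaces().items:
    print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)
```

Because nothing here is provider-specific, the same script – and the same manifests and CI pipelines – can be pointed at any other conformant cluster, which is what reversibility and the absence of lock-in mean in practice.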
"At Saagie, we edit an orchestrator for datalabs that uses Kubernetes in an advanced way, we have tested managed Kubernetes services at other providers. OVH's Managed Kubernetes® Service solution is based on standards, which has given us a very good portability experience! " Youen Chéné, CTO at Saagie.
"We had already tried to set up our Kubernetes cluster internally, but we couldn't do it completely, and we found it too complex to install and maintain. OVH's Managed Kubernetes Service offer allowed us to migrate our applications to Kubernetes without having to worry about installing and maintaining the platform. The beta phase of the Kube project went very smoothly thanks to the presence and reactivity of the OVH teams." Vincent Davy, DevOps at ITK.
"OVH's Managed Kubernetes Service offer provides all the performance necessary for the smooth running of our services, and we are sure to have no surprises on the invoice." Jérôme Balducci, CTO at Whoz.
10. And what are the forthcoming developments of Managed Kubernetes Service?
Already available to all OVH's customers from the Gravelines data centre (France), Managed Kubernetes Service will be gradually deployed into the various OVH Public Cloud regions worldwide over the coming months. This extended geographic availability will reduce latency and increase the performance provided to customers. Managed Kubernetes Service will also soon be compatible with the vRack technology, opening up further possibilities for OVH customers.
DW talks to Mike West, Aqua Group Data Centre Projects Manager, about the company’s recent win at the DCS Awards, focusing on the winning data centre efficiency project, as well as plans for the future.
Please can you give us a bit of background on the company?
Aqua was founded in 2001, with the aim of designing, manufacturing and installing innovative and energy efficient process cooling systems. From day one we worked across various industry sectors, on a global scale. These include food and beverage, packaging, plastics, power generation, education & research, medical, pharmaceutical & chemical and manufacturing.
The development of our unique, patented Leak Prevention System (LPS), propelled us into the data centre arena just over a decade ago. The first commercial LPS installation took place in 2010 and by 2011/12 the data centre industry made up 33% of our annual turnover.
Our business is built on our core values – technical excellence, care, integrity and a “can do” attitude. We train all our people to be specialists in what they do – innovative engineers dedicated to providing practical, effective and efficient solutions, supported by second-to-none customer service and aftersales.
Today, Aqua Group consists of 3 divisions – capital sales, hire and service – employing over 50 staff across the UK. Our head office is in Fareham, Hampshire and we also have offices in Surrey and Leeds and operate worldwide.
Specifically, what makes Aqua Group stand out in the data centre market?
Our USP as a business is our ability to offer complete system design. We are effectively a "one-stop shop" for our clients; we don't just orchestrate off-the-shelf products. When you deal with Aqua Group you get a bespoke, tailored system – from initial design, 3D modelling and energy calculations through to HAZOPs, full project management and delivery.
In addition, when it comes to energy calculations, we are completely transparent and pride ourselves on living up to our predicted savings. Our customers are given full- and part-load, seasonally adjusted figures based on real situations. If we offer a data centre client an annual energy and cost saving, then that's what they'll get, each and every year.
Fundamentally we come from a process background, so our thinking is perhaps very different to traditional data centre suppliers. For example, we’re really comfortable working with natural source cooling options, using say river or sea water. Process cooling is at the heart of everything we do, and I think this gives us a real edge when working with data centre clients.
Can you give us an overview of the company’s data centre products and services portfolio?
At Aqua we have a mantra of "out of the box" thinking. Unlike other manufacturers and service providers, we don't offer our clients a boxed product. We start with our client's base needs and requirements and then design a system with whichever products are going to achieve the best advantage. We're solution-based and will work from initial design through to commissioning and aftersales support. So, for one application the best product might be a high-efficiency free cooler, or a "chiller-less" hybrid cooler, or perhaps CRACs, or an on-chip/immersed cooling product. We work with end-user clients on low to super-high-density solutions, so we deploy a huge array of products.
Our Leak Prevention System (LPS) will always be a personal favourite because of my involvement during the R&D stage; it's been fantastic seeing a product go from concept to commercial realisation. Winning a Queen's Award for Enterprise: Innovation in 2015 was a real highlight for me. It's such an accolade and really demonstrates Aqua's innovation and technical excellence. The LPS enables data centres to benefit from all the efficiencies of water-cooled technology but with guaranteed leak-free operation. The system prevents leaks – it's not a leak detection device – and is a real game changer.
And can you tell us something about the new cloud-based monitoring system?
This technology is highly innovative and, in a nutshell, allows the data centre user to have complete visibility of their cooling system, even if they're not there in person. Key performance indicators can be set and managed in real time, and if equipment should operate outside the set parameters an alert is sent before a problem can occur. The premise is simple: a hardware platform is installed on site, capable of communicating with a multitude of different equipment. You then simply need a mobile link – or hardwired internet connection – to the cloud to retrieve and download the information. The system also interfaces with other types of equipment, and from different manufacturers – essentially anything with a controller able to communicate with a Building Management System (BMS).
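To make the idea concrete, here is a minimal sketch of the KPI-and-alert logic described above. The sensor names, operating bands and warning margins are invented for illustration; this is not Aqua's actual platform, simply the general pattern of alerting before a hard limit is breached:

```python
# Illustrative sketch of KPI alerting: compare live readings against
# user-defined operating bands and warn *before* a hard limit is breached.
# Sensor names, bands and values are hypothetical.
from dataclasses import dataclass

@dataclass
class KpiBand:
    low: float     # lower edge of the normal operating band
    high: float    # upper edge of the normal operating band
    margin: float  # early-warning margin inside the band

def check_reading(name: str, value: float, band: KpiBand) -> str | None:
    """Return an alert message if the reading is outside or near its band."""
    if value < band.low or value > band.high:
        return f"ALERT: {name} = {value} outside {band.low}-{band.high}"
    if value > band.high - band.margin or value < band.low + band.margin:
        return f"WARNING: {name} = {value} approaching its limit"
    return None

# Example bands for a water-cooled system (values are illustrative only).
bands = {
    "chilled_water_supply_c": KpiBand(low=14.0, high=20.0, margin=1.0),
    "crac_return_air_c": KpiBand(low=22.0, high=32.0, margin=2.0),
}

readings = {"chilled_water_supply_c": 19.4, "crac_return_air_c": 27.0}
for sensor, value in readings.items():
    message = check_reading(sensor, value, bands[sensor])
    if message:
        print(message)  # in a real system this would go to the cloud/BMS
```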
What words of advice would you give to people looking to upgrade or replace their existing data centre cooling infrastructure? What are the issues they need to consider?
A key industry issue currently is carbon footprint. There are lots of companies within the marketplace that will offer a "box" product from their catalogue, or consultants that will promote their "tried & tested" method. But you just need to be mindful that this won't necessarily give you the optimum efficiency and energy/cost saving. You also need to consider total cost of ownership. So, for clients working with Aqua Group, we'll factor into our figures not just the electrical cost but other factors such as water usage – which is now a major concern in adiabatic systems – and ongoing maintenance, so that you get a true energy saving. If we recommend a particular system, there will be transparent justification as to why. It's also important to always stay true to the needs and wants of the end-user client. Our engineers will assess each option on its own merits but balance this with the customer's real-life requirements. You want the best solution for your individual scenario.
Can you tell us what it means to win a DCS Award – the Data Centre Efficiency Project of the Year?
We were absolutely delighted to win a DCS Award, being a finalist was an achievement but to win really was the icing on the cake. It was great recognition for everyone involved on the project, both from Aqua and our client 4D. Their Gatwick facility was a great project to be a part of and showcases our commitment to improving our environment by reducing energy consumption and substantially reducing CO2 emissions which impact global warming. Not only with large scale projects like this one, but every time we install energy efficient chillers, free coolers, pump inverters, and so on, we’re doing something positive and so important for our future.
What was the driving force behind the winning project – what business or technology challenge needed to be addressed?
End-user client 4D is a UK-based colocation provider, providing ultra-fast connectivity and business-focused cloud services to organisations around the world. When 4D took over a low-density data centre building at Gatwick, Aqua was asked to upgrade the existing chiller-based cooling system. The project ran in 2018, with installation and commissioning taking place whilst the facility was live. The project design focused on a bespoke solution with energy efficiency, system resilience and reliability at its heart. It was critical that it was completely scalable, future-proofing the client's business model and allowing them to scale up to high-density capacity of 40kW per rack when market demand required.
How did the solution address the challenges and were there any particularly innovative aspects that made it stand out?
The legacy chillers were replaced with a "chiller-less" solution, using a cooling tower alongside a packaged plant room and full water treatment facility. The design included pumps, heat exchangers, dosing equipment, filtration, inverters and remote monitoring. Specially designed CRAC units, which operate on a 2°C difference between air and water temperatures, enable the system to run at higher temperatures. This eliminates the need for compressors – or mechanical cooling – allowing the chillers to be replaced with natural source cooling (cooling towers). This also reduces the number of moving parts within the design, which automatically increases system reliability. In addition, using Carel's CPco platform, the system's control strategy was engineered to minimise energy usage.
The monitoring technology knows the minimum energy requirement for each piece of equipment within the system, ensuring it runs at just the minimum necessary, with zero energy wastage.
A bespoke square pipework system was designed with dual - or 2N - redundancy, so there is no single point of failure. If an issue should ever arise it can be resolved without interruption or downtime in the data centre.
As far as future-proofing and scalability go, the first stage of the project used just one tower, but the project design can take five further towers, giving huge potential for scaling up. Each plant room and water treatment facility is modular, located within a container, so additional resource can simply be added as required.
What major challenges were faced during the project and how were they overcome?
The single biggest challenge of the project was installing and commissioning within a live data centre environment. Project engineers restricted the work carried out within the live facility to a minimum. For example, pipework was 3D modelled so it could be pre-manufactured off site and then brought in for final assembly. Final commissioning had to be carefully phased to avoid any unnecessary swings in supply temperature. The design of the legacy facility also caused challenges. For example, the existing raised floor with cable trays in situ meant pipework had to be carefully routed and specially designed in areas, as there were limited places to cross over.
What tangible benefits has the organisation seen as a result of the project’s implementation?
Energy efficiency and the subsequent cost savings were immediately evident, with the new system consuming significantly less than the previous one. The annual energy saving compared to the old system is forecast at 90%+, with an annualised part-load PUE of 1.125. For 4D's business, they now have a facility that can scale right up to a high-density capacity of 40kW per rack.
Benefits achieved included:
Finally, what can we expect from Aqua Cooling in the future?
More DCS Awards?! On a serious note, things are looking really exciting for Aqua Group over the next few years. Our growth plans as a business over the next five years, combined with some exciting projects in the pipeline, could see us become one of the largest data centre cooling designers in Europe. We're already working on multi-location projects across Europe for the hyperscale market, as well as on state-of-the-art on-chip and immersed server cooling solutions. There are also exciting things ahead with regard to recycled carbon – repurposing carbon emissions, taking them from a waste stream and using them for another application – so watch this space!
Schneider Electric is developing an integrated, comprehensive edge ecosystem designed to provide key partners, the Channel and end users with the tools they need to take advantage of the opportunities offered by this key digital transformation building block. DW reports in the first of a two part article.
A visit to an ice hockey game is compulsory on a trip to the US with Schneider Electric. While the Boston Bruins might not have a chant to rival the 'Let's Go Blues!' slogan of the St. Louis ice hockey team (watched three years ago), it would appear that their play, at least right now, is rather better. At the time of writing, the Bruins were still involved in the play-offs – no doubt inspired by our end-of-season visit, which saw them despatch the New York Rangers in disdainful fashion.
Lest readers think they’ve unwittingly stumbled into some kind of a half-baked sports blog, let me assure them that the Bruins and ice hockey are very relevant to the subject matter of this article – the edge. Entry to the game was via a digital ticket, downloaded via a mobile phone app. And, since my return from Boston, I’ve received three or four marketing emails giving me details of how to obtain tickets for various play-off games and, most recently, I’ve been invited to participate in a Boston Bruins survey.
No doubt if I'd bothered to read the Ts & Cs when I signed up to the ticketing app, I would have seen that I was agreeing to allow all and sundry to contact me with 'relevant' offers, and maybe there would have been an opt-out tick box, but hey, life's too short, right?
And here, in a nutshell, is everything that's good and bad about the possibilities of the edge. It illustrates how what we might call Edge 1.0 needs to develop into a rather more sophisticated solution that can, within reason, understand enough about the end user that he or she is not bothered with irrelevant offers, but only those which will be of interest. Clearly, however intelligent the solution, there will always be anomalies which slip through the system. But in the case of the Boston Bruins, it shouldn't be impossible to quickly discover that I'm not based anywhere near Boston and that my visit to the game was a one-off, hence I'm not of much use to them in terms of future marketing.
There again, one could argue that, so much did I enjoy the ice hockey, if I was emailed with a great offer for a long weekend in Boston to include a ticket to the Bruins, I might just be interested. So, obtaining sophisticated data, to understand the end user better, is an important part of the edge opportunity.
And having firmly put the cart before the horse in starting this article, let’s return to the beginning and a fascinating two days of presentations and information sharing courtesy of Schneider Electric and partners.
Life at the edge
By 2025, 75 percent of enterprise-generated data will be created and processed at the edge – Gartner.
The hype around what might be termed 'edge potential' has been huge. Seen as a major part of the digital transformation process which virtually every business across the planet knows it needs to undertake in some way or other, the edge serves a twofold purpose.
Firstly, it will be obtaining data from millions of endpoints – humans and machines – wherever they are located. The data obtained from these endpoints may be processed locally, and the information obtained will be used to interact with the endpoints. For example, a consumer's mobile phone might alert a certain store that he or she is nearby, and the store's local IT, housed in a micro data centre, will process this information and send some kind of an offer to the shopper: 'Visit us now and get a 25 percent discount', 'Come to the store for your free mystery gift', and so on.
The same data may also head back to a centralised data centre for more long term, strategic planning and customer relationship management.
Secondly, the edge will be responsible for bringing all manner of content – data, voice, images and video – closer to the user. For example, a new game is launched. Rather than have every gamer try to access the game from the same centralised data centre facility, why not move the content much closer to the user, enhancing the gaming experience with little or no latency and hence much better performance?
Right now, there are, perhaps, only two limitations to the development of the edge. One is the human imagination. Plenty of businesses have yet to understand the strategic importance of the edge. Once they do, edge applications will proliferate. The other limitation, and the significant one, is the lack of edge infrastructure.
As Dave Johnson, Executive VP of the Secure Power Division at Schneider Electric, explains: “Edge computing opportunities are huge, across the commercial, industrial and telco sectors but, right now, there are some significant challenges to overcome. Most businesses suffer from a lack of true resiliency across their IT environment, have little or no infrastructure remote monitoring and management, continue to struggle with a lack of IT standardisation and integration, and many sites have few, if any, trained IT staff.”
Schneider Electric’s edge ‘mission’ is to address these pain points, both with its existing technology portfolio and with products and services that will be developed in the future.
For resiliency, the company has integrated micro data centres, row data centres and modular, all-in-one data centres that incorporate security, power, cooling and enclosures, all of which can be controlled via cloud-based management with EcoStruxure IT.
The EcoStruxure IT architecture provides remote visibility and data-driven insights to help optimise operations. Remote visibility makes it possible to manage all of an organisation’s data centre sites from a single device; while big data analytics offer the possibility of identifying significant trends within a data centre and/or predicting failures. This means that more and more knowledge about how the data centre(s) perform is obtained, helping with longer term planning and management. And failure predictions allow for preventative maintenance to be undertaken.
In terms of standardisation and integration, Schneider Electric is aware that no one organisation can provide a complete edge solution for customers. Hence the company’s decision to develop and grow an ecosystem of partners, with a major focus being interoperability.
Dave Johnson quotes a World Wide Technology report: “The ability to pre-configure technology platforms and devices before shipment increases deployment speed and can reduce field engineering costs by between 25 and 40 percent, increase order processing speed by 20 percent and reduce maintenance costs by seven percent.”
As for helping businesses address the issue of lack of IT staff at many sites? Well, the combination of resilient data centre facilities, remote monitoring and management, plus the supply of pre-configured, integrated solutions, means that much, if not all, of the responsibility for the IT and supporting data centre infrastructure is taken away from the customer.
A new way of thinking is required
Following on from Dave Johnson, Schneider Electric’s SVP of Innovation and CTO, Kevin Brown, focused on the issue of edge resilience, pointing out that, at the current time, the large, centralised, mission-critical data centres receive all the attention when it comes to guaranteeing uptime, whereas the smaller, local facilities are seen to be not as important. However, if the promise of the edge is to be delivered, then edge data centres need to be viewed as just as mission-critical as a company’s main data centre site(s).
As Kevin explains: “The industry knows how to make reliable, large, consolidated or centralised data centres, but the importance of the local edge has been overlooked. Today’s digital natives expect the same level of application performance no matter where they are or how they are connected. For a local data centre, this means secure racks, redundancy, dedicated cooling, monitoring and management and local, expert staff – the same requirements as for a major data centre.”
Kevin identifies three key areas for improvement in order to make the edge resilient: an integrated ecosystem; management tools; and analytics and AI to augment staff.
In terms of an integrated ecosystem, Kevin believes that ‘customers need to be better at defining their requirements’. In other words, the customer needs to understand how best to leverage a collaborative ecosystem consisting of physical infrastructure vendors, IT equipment manufacturers, systems integrators and managed services providers. And these four technology provider categories need to collaborate, whether independently of, or in partnership with, their customers.
One simple example – at the local edge, it’s less than ideal if the IT and facilities hardware arrive separately. Much better they arrive together as an integrated solution.
Schneider Electric is already working with key IT vendors to make integrated solutions a reality. These partners include HP Enterprise, Scale Computing, Microsoft, Cisco, NetApp, DellEMC and StorMagic. Most recently, the company’s partnership with Cisco has borne fruit in the form of new solutions for micro data centres that couple APC by Schneider Electric physical infrastructure with Cisco’s HyperFlex Edge hyperconverged infrastructure solutions for ‘quick and efficient’ deployment in edge environments.
Together, Schneider Electric and Cisco can now offer IT global channel partners and system integrators and, of course, their customers, access to several new reference designs for HyperFlex deployments that can be used as-is or be customized to meet specific micro data centre needs. These solutions have been pre-engineered to seamlessly join APC and Cisco equipment for solutions that are pre-integrated, remotely monitorable, and physically secure.
“For IT channel partners and system integrators, a fully integrated micro data centre solution from Schneider Electric and Cisco saves valuable rack-and-stack floor space and time, and these reference designs provide peace of mind that they will be getting a fully optimised solution,” says John Knorr, VP, Global IT Channel Alliances, Schneider Electric. “We’re fully dedicated to the relationship with Cisco and offering the latest innovative solutions to our customers.”
This new offer is part of Schneider Electric and Cisco’s commitment to delivering world-class edge and IOT solutions that offer the highest level of flexibility, resiliency, and speedy deployment.
“Cisco looks forward to more collaboration with Schneider Electric,” says Vijay Venugopal, Sr. Director, HyperFlex Product Management, Cisco. “As the needs for edge compute continue to evolve and the marketplace demands plug-and-play solutions that put the specific needs of the customer front and centre, new solutions like this micro data centre solution with Cisco HyperFlex Edge will be key for success.”
Moving on to the management tools required to address edge computing requirements, Kevin says that conventional management tools are inadequate to deal with this new environment. The traditional approach means that each device is managed separately and requires its own IP address; whereas what is needed is that each edge site should be managed as a complete micro data centre. One dashboard to manage all components as a single system at a given edge site.
Schneider Electric believes that, in order to obtain the best performance from such new management tools, they need to be cloud-based. This should mean that they are easy to get started with, can be accessed anywhere at any time, can be scaled up as required (pay as you grow), are maintenance-free, and feature automatic software updates and backups, along with up-to-date security.
Finally, comes the AI and analytics required to augment local staff. If defining the edge is complicated (Schneider believes in the centralised, regional edge, local edge model), how much more so developing a successful AI implementation strategy?! However, Kevin identifies four key ingredients: a secure, scalable, robust cloud architecture; a data lake with massive amounts of normalised data; a talent pool of subject matter experts with deep knowledge of system behaviour; and access to machine learning algorithm expertise.
In practical terms, Kevin uses the example of analysing the thousands of alerts received from across a data centre estate and turning them into an infrastructure health scorecard, which prioritises what actions need to be taken by when. The benefits of such an approach (requiring sophisticated AI/machine learning to understand normal data centre behaviour and hence to identify events outside of this norm) include a massive reduction in the hours spent manually evaluating alarms, the avoidance of downtime and peace of mind for those responsible for both running and using the data centre and IT infrastructure.
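As a rough sketch of that alarm-to-scorecard idea (the severity weights, alarm fields and example feed are invented, and this is not how EcoStruxure IT actually scores sites), the core pattern is simply to collapse a stream of raw alarms into a ranked, per-site penalty score:

```python
# Sketch: collapse a stream of raw device alarms into a per-site health
# penalty and a ranked action list. Weights and alarm fields are invented
# for illustration only.
from collections import defaultdict

SEVERITY_WEIGHT = {"critical": 10, "warning": 3, "informational": 1}

alarms = [  # hypothetical alarm feed
    {"site": "clinic-07", "device": "ups-01", "severity": "critical"},
    {"site": "clinic-07", "device": "cooling-02", "severity": "warning"},
    {"site": "campus-dc", "device": "pdu-14", "severity": "informational"},
]

def score_sites(alarm_feed):
    """Aggregate weighted alarm counts into a score per site (higher = worse)."""
    scores = defaultdict(int)
    for alarm in alarm_feed:
        scores[alarm["site"]] += SEVERITY_WEIGHT[alarm["severity"]]
    return scores

ranked = sorted(score_sites(alarms).items(), key=lambda kv: kv[1], reverse=True)
for site, score in ranked:
    print(f"{site}: health penalty {score}")  # top of the list = act first
```

In a real deployment the weighting would come from machine-learned models of normal behaviour rather than fixed constants, but the output – a short, prioritised list instead of thousands of individual alarms – is the point.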
The Schneider solution
Schneider Electric’s EcoStruxure IT software and services have been designed to address many of the issues and pain points associated with edge computing and the wider role of data centre management. EcoStruxure IT is a cloud-based DCIM solution designed to simplify infrastructure management when compared to traditional DCIM and is optimised to manage the edge. EcoStruxure IT provides infrastructure visibility and analytics-based, actionable insights across the hybrid IT environment. Key features include: centralised, real-time device monitoring and alarm consolidation; health status across the entire environment; access to infrastructure data regardless of location and time; analytics, industry benchmarks and actionable recommendations; and mass device configuration and firmware updates.
In keeping with Schneider Electric’s collaboration message, various versions of EcoStruxure IT are available, allowing end users to go it alone, allowing a preferred partner to operate the software or allowing Schneider Electric to do the hard work – or any combination of the three. Most recently, EcoStruxure IT for the Channel has been introduced, allowing resellers, systems integrators and the like to offer a managed service to their customers.
In finishing his presentation, Kevin gave a couple of examples of how EcoStruxure IT is already helping real-world customers address real problems. A hospital system with 51 hospitals and hundreds of clinics spread across five US states had no enterprise monitoring tool for its 2,000 or so devices and, therefore, no visibility across disparate networks. Deploying EcoStruxure IT has provided one consolidated view across the networks, easy deployment, remote device configuration and firmware upgrades, and alarm consolidation, which has led to increased productivity.
At a higher education university with 20,000 students, the challenge was managing 400 devices in closets across the campus, alongside 210 devices in the data centre. The university had no way of managing the local closets, and needed a centralised, consolidated view of the physical infrastructure as well as mobile visibility.
In action, the Schneider Electric solution means that the director of facilities receives alerts on a mobile device while at home and calls the NOC for an update – before the NOC has even realised there is a problem!
5G and the telco edge
Steve Carlini, Schneider Electric’s Vice President Innovation, CTO Office, offered some fascinating thoughts on the role that the telecoms sector has to play in building out edge infrastructure and how this might impact on data centre architecture. The key takeaway message from Steve’s presentation, and a subsequent offline discussion, seems to be that the future of 5G and wireless more generally is anything but predictable. This poses something of a challenge for end users gearing up for the arrival of this technology as there’s no obvious, universally agreed roadmap for the introduction of 5G, let alone what might be done to leverage/extend the life of the existing 4G, 4.5G and LTE infrastructure. Cost is a major consideration as 5G requires a whole new, software-defined infrastructure to be developed. The telecoms industry is not exactly awash with money right now, so there must be some debate as to how or when the telco vendors will commit to a wholescale 5G roll-out schedule. The more so, when one considers that there are plenty of areas where obtaining a 4G signal, and sometimes even a 3G one, can be challenging.
Perhaps better placed to embark on the necessary 5G infrastructure build are some of the large IT vendors or, most likely, the hyperscalers? After all, it is their applications and user ecosystems that are driving much of the demand for faster, smarter content delivery. So, who better to do the build out, especially given their healthy balance sheets?
So, the 5G picture is distinctly unclear in terms of when it will become mainstream.
However, what’s already clear is the potential 5G offers. 5G networks can operate up to 10 times faster than 4G networks, with one millisecond latency as opposed to 40 or 50 milliseconds with 4G. 5G networks also have greater capacity than comparable 4G infrastructure – as much as 1,000 times more. Just for good measure, 5G networks are also promised to be more reliable.
What’s not to like?!
Equally clear is that 5G is a key enabler of edge computing applications. And, while there are the inevitable debates over 5G standards and the like, it’s also reasonably clear as to what 5G infrastructure is going to look like.
Software-defined networks are key, as the networks will be sliced to match the delivery method. High bandwidth will be used for wireless broadband; ultra-low latency will be required for real-time control; low energy and low bandwidth will address the IOT/sensor market; and ultra-high bandwidth will be needed for video streaming applications.
A distributed cloud architecture will provide the optimum infrastructure for 5G deployments – comprising a central core cloud, regional or metro edge core cloud and local edge core clouds providing the radio access networks (RANs) which will deliver the quality of service required for a whole host of 5G applications. These will include everything from utilities, smart cars, traffic management, emergency services, and hospitals to agriculture, industrial and stadia, for example. The local clusters are crucial to ensuring that the latency targets for any specific application are achievable.
Also important to the success of 5G deployments are network function virtualisation (NFV) – an open platform enabling software controlled functions, running on commodity servers; and multi-access edge computing (MEC) – bringing content delivery and compute functions, combined with telecom functions such as call routing and automation, closer to users.
The Schneider portfolio already plays well in the central core cloud, regional/edge core cloud and local edge core cloud (base station/rooftop/parking lot etc. locations). And the company has plans to further develop its technology offering specifically for the local edge.
For example, Schneider Electric has collaborated with Lenovo on a local edge core micro data centre, and has worked with Edge Micro on a local edge core prefabricated data centre.
In summarising the 5G opportunity, Steve Carlini explained how the requirement for content to be cached and/or processed at the regional/metro and local edges means many more data centres will need to be built. And Schneider Electric intends to be a key player in this new, edge data centre market.
How has GDPR changed the security and compliance landscape? Over the following pages, you’ll find a range of views and opinions as to what’s changed, for better and worse! Here, we start with a GDPR Q and A with Mark Thompson, Global Privacy Lead, KPMG.
“Reportedly, regulators have received 64,000 data-breach notifications from across the EEA since the General Data Protection Regulation came into effect[1]. With hundreds of investigations currently in progress, we are slowly starting to see substantial enforcement and fines as a result of non-compliance with GDPR. This shows that organisations still have a long way to go in placing privacy at the top of their priorities and at the centre of their operations.
“We’ve seen organisations get burned for thinking GDPR is an umbrella term that captures all privacy regulations. In truth, GDPR is just one example of hundreds of privacy regulations operating globally. A lot of companies have implemented it and assumed they’re compliant as a result, but this is not the case as GDPR isn’t recognised in all overseas markets. Looking forward, businesses need to think differently, and more holistically, about privacy.
“There’s also a need for more board level accountability when it comes to data management in businesses. Privacy was definitely high on the board level agenda last May, but it has since slipped down the priority list. It needs to be considered like any other critical asset and be consistently thought of as a priority at board level.”
“In the digital economy, companies know more about their customers than ever before. Manufacturers, retailers and platform companies are already unlocking the value of data by configuring quicker, easier and more personalised experiences to win, retain and build trust with customers. Yet this value will not endure if companies fail to understand what consumers think about their data, how it is used and who they should trust to protect it. In this changing landscape, companies need to look beyond such concepts as permissions and consent and recognise that data privacy is far more than a compliance-led, box-ticking exercise. Data is an asset that, mishandled, can become a liability that damages your brand and destroys trust.
“In the next year, we anticipate organisations will move into ‘phase two’, where they’ll look to make privacy processes more efficient and operationally effective, leverage technology, and put in place the right foundations with the customer firmly at the heart of how they approach privacy. Done right, this will enable organisations to leverage personal information to deliver great products and services, create value and gain a competitive edge.
“This next step is to re-evaluate the dynamic of data as an asset vs data as a liability and to put in place the right resources, structure and budget to support one of the organisation’s most valuable assets.
“This will become increasingly important in the coming months as regulators begin to levy more enforcement actions. We are also likely to see more emphasis being placed on more complex privacy issues, such as international transfers and data subject rights.”
“One of the most widely reported issues businesses are still facing with GDPR is understanding what data they have on record and how it is used. We recently worked with a client whose business operates across 5,000 systems. Establishing what data is stored amongst thousands of systems, and how to achieve compliance, can be a difficult task. We would not recommend carrying out a project of that scale manually; however, we know that organisations have generally carried out their “data inventory projects” in a manual way, and these are fast becoming out of date and unusable.
“Another challenge for companies is how to manage numerous data interactions across their enterprise cycle. We are seeing data from a single consumer interaction move across many different organisations and getting a handle on those touch points and the different uses of data can be a complex task. We are also expecting the number of touch points to increase tenfold over the next four to five years. Digital advances have led to a significant transformation in how much data is used and how it gets passed down the value chain.
“In addition to this, companies also often find it challenging to understand what the consumer expects in terms of data protection and getting the balance right. Research shows that personalisation is a key factor in building customer trust - the more businesses can behave like the customers they serve, the more the trust grows. However, personalisation is generally only possible with a large amount of customer data, and the only way to obtain customer data is by building trust.
“A further key challenge is the ambiguity around how GDPR principles can be interpreted. The regulators have provided guidance, but organisations have had to consult lawyers and make a decision as to what they think the right position is. Some organisations are being very risk averse while others are interpreting the requirements a lot more broadly. We will have more clarity on the regulatory ‘grey areas’ when we start seeing case law and enforcement actions being issued in the next few months.
“All these challenges have been underpinned by a significant lack of technology to support GDPR – privacy tech is limited in the marketplace and most of the available technology is being delivered by start-ups. There hasn’t yet been a solution or a group of solutions that can be easily bolted onto a business’ current technology infrastructure, which remains a challenge for companies looking to implement long-term change.”
“If you look at global businesses, some have spent over £100 million on implementing GDPR, while others spent less than £1 million. Looking at a company’s overall investment is interesting, as it shows how much a business is taking the data-as-an-asset vs data-as-a-liability dynamic seriously and whether it’s genuinely looking to transform for the future.
“Data mapping is one of the most significant costs reported by businesses – this includes understanding where the data is kept and how it’s being used. As this is still often being done in a manual fashion, it also requires a lot of resource.
“Human capital resources, including hiring lawyers and contractors, have also been a common expense for companies. The review of large volumes of contracts, to make sure they contain the right clauses and are fit for purpose from a regulatory perspective, has been an additional source of expense.
“There has also been a high cost associated with hiring resource internally. In many businesses, privacy was traditionally dealt with by one person or a very small team; it now often involves a wider team with multi-disciplinary skills, operating in a small resource marketplace, which represents a big step change for companies in terms of cost.
“It’s also worth noting that we’ve seen companies lose money by shoehorning privacy into business processes and customer interaction points inefficiently. For example, a customer can access a website and have to click through numerous cookie settings to access the information they want. This is an example of how some businesses have applied privacy processes but have impacted the customer journey in the process. This is costing businesses from a revenue perspective, and from a customer experience and trust perspective.”
As we approach the one-year anniversary of GDPR, this new regulation - not just in the UK but globally - has forced organisations to pay much closer attention to their intake, use and storage of data. GDPR grants consumers the ‘right to erasure’ and with thousands of companies using in-house data centres, it’s clear that there needs to be a monumental shift in maintaining best practice in data storage. What better time than spring, for a clean-up?
By Fredrik Forslund, VP Enterprise and Cloud Erasure Solutions, Blancco.
Organisations are making a mistake in choosing to store data in-house to comply with regulation. A lack of education has left people paralysed by fear over security concerns. Data breach scandals have littered the headlines in recent months and the UK competition regulator is reframing the antitrust policy for the tech sector. It is well documented that breaches due to loss of data not only cost businesses financially, but also cause serious damage to brand reputation. Ironically, leaving residual data in data centres to save money costs more than the erasure process itself.
The common pitfalls of compliance
Data management should be an ongoing concern when both the risks of data breaches and the hidden costs are considered. Education around the right methods of data sanitisation should be a priority, but there are a few common mistakes made by organisations globally.
Physical destruction – such as shredding, grinding and degaussing – is a common method of data sanitisation, but it is not fully effective. The increase in popularity of SSD drives is particularly concerning, as data has been found on this type of drive after the physical destruction process. There are also the environmental concerns that come with this approach. Furthermore, physically destroyed assets often end up in landfill, which is a waste, because many drives contain precious metals which could have been recycled for other purposes. Today, we’re seeing industries and countries lead the way in this space by looking at new ways to repurpose these precious metals. Tokyo, for example, recently announced that all medals for the 2020 Olympics will be made from metals recycled from electrical components. We should not just be concerned about saving money, but also about saving the environment.
Many data centres run into problems dealing with Return Material Authorization (RMA) hard drives, too. Difficulties arise when agreements to return drives to the manufacturer are breached, because these drives may contain sensitive customer and business data when leaving the premises. Enterprises end up holding onto them to protect the data, incurring penalty costs from the storage suppliers as a result. Drives containing residual data that are wasting precious space have the potential to be reused, or even returned to the manufacturer for a refund, if properly sanitised. The potential to cut down on costs is immense.
Cryptographic erasure – the process where data is encrypted before it’s erased – is another popular method of sanitisation. However, it’s subject to human error. Unless keys are stored and managed securely, they are vulnerable to attack. Furthermore, encryption has a shelf life: cryptography is advancing apace, and algorithms that were once considered strong can eventually be broken.
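The principle behind cryptographic erasure can be shown in a few lines. In this deliberately naive Python sketch (using the widely available `cryptography` package; the key handling is simplified precisely to highlight the key-management point above), data is only ever stored encrypted, so destroying the key is equivalent to destroying the data:

```python
# Illustrative sketch of the crypto-erase principle using the `cryptography`
# package: data is only ever stored encrypted, so destroying the key renders
# the stored ciphertext unreadable. Key handling here is deliberately naive.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()          # in practice: held in a key-management system
ciphertext = Fernet(key).encrypt(b"customer record 42")

# Normal operation: the key decrypts the stored ciphertext.
assert Fernet(key).decrypt(ciphertext) == b"customer record 42"

# "Erasure": the key is securely destroyed (here, simply discarded). The
# ciphertext may still sit on disk, but without the key - and assuming the
# cipher remains unbroken - it cannot be recovered.
key = None
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)  # a different key fails
except InvalidToken:
    print("data unrecoverable without the original key")
```

The sketch also makes the weaknesses visible: if the key leaks, or if the cipher is broken years later, the ‘erased’ ciphertext becomes readable again.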
Because of the challenges associated with these methods, storing residual drives onsite often seems like an easy option. To add further pressure, there is a lack of resource for organisations to comply with new regulation and following best practice can simply be too onerous. Furthermore, it can be difficult to automate the erasure process, so most organisations sanitise one drive at a time, incurring even higher costs because of the labour it takes for the sanitisation process.
A recent survey revealed that of the IT and data protection professional respondents, 15 per cent reported that they lack funds for their GDPR plans and 20 per cent simply don’t have the time to focus on it. But with administration costs building over time, data centre owners need to adopt the best possible data management and erasure processes, before it’s too late.
Best practice spring cleaning
So, how does one approach spring cleaning a data centre? The approach must be comprehensive: the entire asset lifecycle needs to be considered, from purchase to disposal, with regulation-conforming processes put into place throughout. To ensure absolute confidence, all data should be securely erased from assets before they leave the data centre. It is crucial that the sanitisation process includes an auditable and digitally-signed report which proves that the data has been fully erased. Time should be taken to find the most cost-efficient solutions, so where possible, drives should be sanitised and re-used. And most importantly, if entire servers, racks of servers or even SANs are repurposed or decommissioned, the drives should be erased in the system to save staff from removing and handling loose assets unnecessarily.
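On the auditable, digitally-signed report: certified erasure products use proper digital signatures and PKI, but the underlying idea of a tamper-evident record can be sketched with the Python standard library alone. The field names, method string and shared secret below are invented for the example:

```python
# Sketch of a tamper-evident erasure report. Real certified tools use proper
# digital signatures and PKI; this stdlib HMAC version only illustrates how a
# report and its signature can be re-verified later during an audit.
import hashlib
import hmac
import json
from datetime import datetime, timezone

AUDIT_SECRET = b"replace-with-a-managed-secret"  # hypothetical

def signed_erasure_report(asset_tag: str, method: str, operator: str) -> dict:
    report = {
        "asset_tag": asset_tag,
        "method": method,                       # e.g. an overwrite/purge standard
        "operator": operator,
        "completed_at": datetime.now(timezone.utc).isoformat(),
    }
    payload = json.dumps(report, sort_keys=True).encode()
    report["signature"] = hmac.new(AUDIT_SECRET, payload, hashlib.sha256).hexdigest()
    return report

def verify(report: dict) -> bool:
    payload = json.dumps(
        {k: v for k, v in report.items() if k != "signature"}, sort_keys=True
    ).encode()
    expected = hmac.new(AUDIT_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(report["signature"], expected)

record = signed_erasure_report("SRV-0042-HDD-3", "full overwrite purge", "j.smith")
print(verify(record))  # True, and False if any field is altered afterwards
```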
Data centres are growing in both size and capability, increasing the need for greater awareness of data erasure best practice. Complying with data protection regulation can result in a cluttered data centre, causing unnecessary costs to build. It’s time for education and action. Data sanitisation is a complex process and can cause concerns over privacy when best practice is not followed. The erasure process should be entrusted to an industry certified vendor. Not only will they be compliant with new regulation, but they can also offer expertise and efficient processes that will be key to success. Organisations need to act fast and start the clean-up process. Privacy concerns will be relieved, costs will be reduced – and the environment will be much better for it.
May 2019 marks the first anniversary of the General Data Protection Regulation (GDPR), and early numbers make clear that its implementation has been a success as a breach notification law. As such, GDPR has affected multiple aspects of a business. It has created increased requirements for businesses to deal with issues such as security, compliance, data ownership, training and data management. For many businesses, the regulation requires a fundamental change to their internal processes and an ongoing focus on compliance.
By Frank Krieger, Vice-President, Governance, Risk and Compliance, iland.
There are several myths around who manages data inside an organisation which have been challenged as a result of GDPR regulations. From the shift from an IT-centric to a business process owner model, to educating internal teams and reviewing tools, here are the top five myths around management of data that GDPR effectively busted.
1. Data Management is an IT function
Data management used to be solely an IT function but, since GDPR came into force, organisations have increasingly realised the criticality and value of their data assets. This is why data management has become a business as well as an IT function. It requires a full commitment by every organisation to build data protection into its culture and all aspects of its operations, from support through accounting to product development. The GDPR is not specific to IT; it must permeate all aspects of the organisation to ensure a culture of data privacy is built.
2. Business organisations have always been familiar with data management
Since the new regulation made data management a business, not just an IT, concern, awareness around GDPR needed to be expanded to different departments in an organisation. Many parts of business organisations were not familiar with data management and had to be trained and managed around the issue. However, a recent paper by Osterman Research showed that only 42 per cent of organisations have trained their employees around data management and GDPR, meaning that 58 per cent left their employees in the dark.
3. All departments understand how to manage and control data
As mentioned above, data management used to be exclusively an IT function, and IT teams had a good understanding of the way data should be managed and controlled. Those in business functions tended to accumulate data and lacked access control, putting that data at risk. Today, the responsibility for compliance is shared across the different functions. Non-IT employees cannot simply close their eyes to the risks they take when handling their company's data. Raising awareness is crucial to prevent data breaches and impacts on the organisation's finances and reputation.
4. GDPR isn’t relevant for everyone
Departments have been affected in different ways and to different degrees: some have been living and breathing the regulation for several years, for others it may be new. But being data protection-aware is no longer optional, it’s critical and regulated. An ongoing continuous programme of education – from induction through regular refresher sessions – is essential. This helps make data awareness relevant for everyone from the Chairman of the Board to the customer service team and beyond.
5. Data protection stops at the organisation’s perimeter
Suddenly, businesses realised that they were responsible not just for their own data protection compliance, but for that of all the links in their supply chain. Cloud computing is a case in point: IT and business managers realised that their CSP needed to be just as compliant as they were in order to avoid a huge security gap. The relationship shifted from client–supplier to a collaborative security partnership, as the degree of trust and diligence needed between the parties escalated.
From myth to reality
Overall, the understanding of the value and risks around personal data had to be propagated through organisations and actively monitored. GDPR didn’t act as a reminder of what ought to be done, but instead as a proper new regulation. It has changed how organisations collect and manage data and personal information, busting the myth that data management lived in the IT department silo and making it relevant for everyone. That has required extensive investment in people and tools to oversee, and a re-evaluation of business relationships with suppliers and customers alike.
May 25th marked the one-year anniversary since the implementation of the GDPR. In the run up to this time last year there were numerous predictions on how the regulation would impact organisations across almost every sector, as businesses scrambled to prepare.
Essentially any organisation that holds European personal data must comply, whether a small start-up or a multibillion-dollar enterprise. One of the major concerns around the GDPR was the high-stakes repercussions, as businesses can face penalties of up to €20 million or 4% of their global annual turnover. But how many businesses have been fined in the past 12 months, and how far has the GDPR shaken up the industry?
We spoke to several experts who shared their thoughts on what they have seen in the industry, and arguably the associated fines have been smaller than organisations anticipated. Adenike Cosgrove, Cybersecurity Strategist, International at Proofpoint, explained: “Everyone expected big fines in the first year, and we have seen those affecting Google and Facebook, but not much else to date.”
In agreement, Colin Truran, Principal Technology Strategist at Quest added: “The total fines to date are around €56 million – which you would initially think is a lot, but actually, almost all of it comes from French data watchdog CNIL's €50m fine for Google.”
However, the GDPR has impacted the industry in other ways. The regulation is not designed to punish businesses but to hold them accountable and improve awareness around security and data. Cosgrove explains: “So far, 2019 has proven to be a transition year where the regulators are flagging concerns and companies are given a reasonably generous chance to fix issues. One example would be HMRC’s deadline of the 5th of June to delete five million VoiceID files it obtained without explicit consent.”
She continued: “Since the GDPR deadline this time last year, the number of breaches reported has gone up significantly. This is indicative of the fact that for the first time, European companies have to report on any breaches that impact the privacy of EU residents, and the regulation is surfacing the scale of the issue. What we are also seeing however is the regulators asking companies to provide them with detailed information on how data is processed and what controls they have in place. This is valuable time away from the business for companies who have no choice but to allocate resource to respond to the regulators. As part of this exercise, organisations have to secure their entire ecosystem to meet GDPR requirements. This therefore impacts smaller organisations that may be part of larger companies’ supply chain for example. We have seen plenty of cyber-attacks targeting supply chains so this is a critical aspect for companies to consider.”
Truran also explained how the GDPR has impacted the industry in other ways: “Whilst those of us in the technology industry have been discussing GDPR at length over the last 12 months, espousing the benefits of having defined compliance processes in place, the first few weeks of the implementation of GDPR saw the gravity of the situation become impossible to ignore. A tidal wave of privacy policy update emails hit the inboxes of practically every member of the EU public, prompting varying degrees of confusion, frustration, and mockery in the media. For better or for worse, we’re now in a position where every individual, regardless of technical savvy, is more aware than ever about their right to digital privacy and the level of control they have over their personally identifiable information (PII).”
“As more and more businesses now look to cover their backs and demonstrate varying degrees of compliance to their users, this new era of data privacy awareness could be more than many businesses bargained for when the Information Commissioner’s Office (ICO) inevitably comes knocking.”
“However, GDPR has not yet had that real wake-up call that many thought it would. The fines to date have been well within budget – not insignificant, but not exactly life-changing either. There is also a clear discrepancy between how data authorities in different countries are applying it, so despite having a common set of rules it is not a level playing field. With all that said, it is still early days, and most of the breaches dealt with so far occurred before the GDPR was ratified into law. Therefore, this year will decide whether GDPR is the effective solution it was intended to be, or just another piece of bureaucracy that fails to have the desired effect.”
From a security point of view, Ian Bancroft, Vice President and General Manager, EMEA at Secureworks also shared his thoughts: “One thing that has quickly become apparent is the complexity around GDPR. Over the past 12 months we have seen more customers seeking external expertise when it comes to security controls and best practices. Businesses have realised that, for the majority, GDPR requires expertise, resources and understanding beyond internal capability. However, any regulation that puts security at the forefront of the business agenda is a good thing.”
Bancroft continued: “By holding organisations responsible, the regulation is reaffirming that businesses need to know their data, manage it, and build a strategy which protects every stakeholder from investors to the end user. Ultimately, regulations like GDPR are one of the key reasons behind the shifting role of traditionally non-strategic roles in the boardroom like the CFO, CTO and CSO. With the value of data growing exponentially, those who are directly responsible and impacted by data will increasingly find themselves consulted on how to use this asset effectively, and above all else, securely.”
Various industry experts offer their views and opinions as to what has been the impact of GDPR since its introduction a year ago.
David Kemp, business strategist, Security, Risk and Governance, Micro Focus, considers why companies can no longer afford to ignore the GDPR...
“On 25th May 2019, the world will note, rather than celebrate, the first anniversary of the EU GDPR. While its conception was timely and necessary given the modern-day data explosion and the surge of social media, many businesses have been slow off the mark to comply with the regulation. On the one hand, companies were not prepared; on the other, the regulation could not be suitably enforced due to the limited resources allocated to regulators. Unfortunately, in the run up to the GDPR’s one-year milestone, surprisingly, large corporations and government agencies are still at an early stage of compliance.
“However, despite the slow start we’ll begin to see a major drive from organisations to achieve data privacy compliance, not only within the EU but worldwide. This push will be a result of regulators now possessing sufficient manpower to achieve demonstrable enforcement – as we’ve already seen with Google’s £44m GDPR fine in January 2019. And parallel legislation in the form of the California Consumer Privacy Act 2018 coming into force in 2020, as well as similar regulation across APAC, will undoubtedly increase pressure. In the UK, the Information Commissioner’s Office, as the DPA, has further raised the stakes by now achieving jail sentences under the Computer Misuse Act – meaning that the risk of non-compliance is now a matter of deprivation of liberty, not just fines.
“As more sanctions and data privacy-focused headlines emerge, individuals will increasingly recognise their right to data privacy – and ability to hold businesses to account for negligence. Put simply, companies can no longer afford to ignore the GDPR or its parallel peer nation data privacy legislation.”
Matt Eckersall, Regional Director, EMEA West at SUSE, considers the role of data storage in GDPR compliance…
“Data privacy and data storage are intrinsically linked. While data privacy has always been a priority for storage providers, the GDPR has brought this into sharper focus over the last year or so. With the correct storage infrastructure in place, companies can achieve regulatory compliance and ensure customer trust in their brand is not damaged by either cybersecurity concerns or a lack of transparency around data storage and use.
“Both the GDPR and data explosion have resulted in an increasing business need for agile, cost-effective, scalable storage solutions which can help the organisation to grow, compete and survive – and achieve compliance with data protection regulation. While data privacy cannot be addressed with a single silver bullet, storage is a good place to start. One year after the GDPR came into force, businesses need to be considering their storage infrastructure as part of their compliance check process – or risk falling foul of the regulation.”
Simon Wood, Group CEO at Ubisecure, considers what lies ahead for the GDPR…
“The implementation of the GDPR saw the introduction of the most substantial privacy legislation globally. However, I observed that while companies were rushing this time last year to achieve basic compliance, regulators were in a similar state of ‘lack of readiness’ – and actually facing much less pressure to be prepared. So despite the initial noise created around the regulation coming into force, the traction we’re seeing now is comparatively small.
“While fines have already been issued, these have been relatively minor in contrast to the 2018 threat of 4% of turnover. That said, I suspect the first real surge in non-compliance fines will trigger the next round of deeper implementation.
“There is a logical parallel here with the iterations of regulations in the payment industry – PCI DSS first, followed by PSD2. In some senses, the GDPR could be described as the second wave of privacy as all EU countries had local regulation prior to last year. But looking ahead, the GDPR is really only the beginning. I would suggest that all organisations should ensure good privacy practice because the real second wave of privacy – and therefore the real test – is yet to come.”
Janet de Guzman, senior director, industry marketing and compliance group, OpenText, considers moving from data owner to data custodian...
“With the first anniversary of Europe’s General Data Protection Regulation (GDPR) coming later this month, you may have assumed the state of data privacy would be incredibly different compared to this time last year. One year later, we have seen examples of high profile data breaches, but we have not yet seen European authorities use their new regulatory powers to impose the maximum penalties on multinational companies falling foul of the GDPR.
“However, while companies continue to collect massive amounts of data, this past year has marked a turning point in data privacy policy. Companies worldwide raced to make the May 2018 deadline. Regulators in Europe hired additional staff and began to test the impact of enforcing new policies. Countries beyond Europe, such as Nigeria and Japan, advanced regulations that mirror the GDPR. Furthermore, the D9 is leading the development of an open government that has data privacy at its core. One D9 member, Estonia, has even opened the world’s first data embassy to give the data it holds diplomatic status.
“More generally, there has been growing awareness around how organisations capture and process personal data. Yet, businesses from all sectors are still having difficulty complying with critical parts of the GDPR. Requirements like 72-hour notification of a breach and the fact that consumers can request copies of the data companies have about them have been especially challenging. To overcome these issues in the year ahead, organisations must transition from data owner to data custodian.
“This requires organisations to determine what personal data they have, where and how it’s stored and processed, who uses it, what it’s used for and why, and whether they have the right consent from a specific individual. Discovery consulting services can help to find and identify this information and an enterprise information management (EIM) platform is indispensable for compliant management of the data and processes associated with it. This visibility will be essential in the second year of GDPR enforcement if businesses are to achieve compliance status and avoid potentially huge penalties for regulatory infringement.”
Alberto Pan, CTO at Denodo, considers the role of data virtualisation in GDPR compliance...
“The European Data Protection Regulation is going in the right direction to guarantee the rights of users, but I think that the legislator underestimated the technical complexity of implementing the regulations. In large companies, personal data is usually distributed across multiple repositories, both locally and in the cloud, which poses an integration problem. Traditional data integration techniques are based on making even more copies of the data, which exacerbates the problem. Therefore, analysts such as Gartner and Forrester are recommending logical architectures based on data virtualization. These technologies provide a central point to access, integrate and govern the data without needing to replicate them, facilitating compliance with the requirements of the GDPR.”
Cindy Provin, CEO at nCipher Security, considers the impact of GDPR in terms of data protection...
“Since the GDPR came into force, we’ve seen a variety of breaches and fines occur, ranging from large, established organisations such as Google, Facebook, Uber and Marriott, to smaller organisations. With over 200,000 cases reported across Europe, the introduction of the GDPR has shown us that no organisation using the personal data of EU citizens can avoid compliance and accountability.
“Before it took effect, much of the GDPR-related focus was placed on the potential fines and penalties associated with data breaches and a lack of compliance.
“The reality is that this regulation – as well as future data protection laws – should be seen as a positive step in the battle to prevent data misuse. These regulations are not designed to discourage the use of data, but to provide consumers with reassurance that their personal information is in safe hands. They also encourage businesses to follow best practice when it comes to control and governance, two traits that cannot be overlooked in today’s modern cyber landscape.
“The future of data protection means a commitment to accountability. If organisations wish to use data to gain a competitive edge, they must be prepared to take responsibility for its use and protection. It also means a commitment to transparency: transparency in telling customers how their data is being collected and used, and transparency when it comes to disclosing the scale and affected parties if a data breach does occur.
The GDPR marks a new era in the way that businesses think about data. And it’s about time. After all, we now live in a digital economy and data is any business’s most important asset, regardless of size or sector.”
Various industry experts offer their views and opinions as to what has been the impact of GDPR since its introduction a year ago.
Spencer Young, RVP EMEA at Imperva, says that GDPR has fundamentally changed the UK’s data protection landscape:
“The regulation has meant that regardless of the industry or location, any business that holds and processes personal data must prioritise data protection. But have organisations learnt anything in the past year when it comes to protecting their data?
“One year on, the majority of businesses in the UK are not taking GDPR as seriously as those outside the EU. There is an obvious lack of awareness amongst organisations that don’t take into account the potential consequences of failing to meet the requirements. According to Hiscox, in a survey of SMEs in the UK, nearly 40% did not know who the legislation even affects.
“With large brands such as Google already tripping up on their GDPR journey, it comes as no surprise that other businesses are following suit. In March this year a medium-sized Polish company was fined €220,000 because it did not tell people that their data would be processed. GDPR is not just about data breaches.
“The single €50 million fine levied against Google accounted for nearly 90% of the €56 million in total fines imposed during the first nine months of the legislation being enforced. That in itself tells a story.
“The UK government in particular are not being hot enough on compliance. To many organisations the pandemonium of GDPR was left behind in 2018 and seemingly replaced with the confusion of Brexit. What’s even more worrying is that the threat of hefty fines and damage to brand reputation is not acting as a strong enough deterrent.
“However, what organisations need to consider is that citizens are now becoming more conscious of the importance of protecting their data and have no problem with issuing complaints. With over 95,000 complaints coming from citizens within the first nine months of the regulation coming into play, companies must immediately assess how to safeguard user information and protect people’s privacy.
“The bottom line is that organisations must address GDPR compliance by implementing data-centric protection measures. There must be a close focus on securing data where it resides and everywhere it travels across the network to ensure no data is left unprotected. Security measures must focus on the data itself – endpoint and perimeter protection are important, but if the data itself is still at risk, the problem remains.
“Ultimately, with the numbers of GDPR-related complaints in Europe on the rise, companies need to act now to ensure their organisation doesn’t suffer in the long run.”
Gerald Beuchelt, CISO at LogMeIn, says that the regulation gives people more control over their personal data and requires data processing companies to exercise greater care and security when dealing with customer data and third parties:
In this context, passwords play a particularly important role – on the one hand, they ensure that access to data processing companies is secure so that only authorised users can access information. On the other, customers of these companies use passwords to access digital services. Password security must be effective in both areas. It is not uncommon for the same password to be used across multiple accounts or to be jotted down for everyone to see in the event that a password is forgotten. Password managers help to keep track of credentials and safely handle the authentication process. They generate secure passwords for users to access each of their accounts and store them in a central repository acting like a vault.
LastPass, the market-leading password management solution, attaches great importance to the safety of customer data:
“We have invested heavily in our own privacy to ensure that we are a trustworthy, secure and reliable company. Additionally, we have developed our products to ensure they comply with European privacy policies including GDPR. The security of LastPass is based on a zero-knowledge security design. This means that neither LastPass nor LogMeIn, as SaaS providers and hosts of their customers’ login details, have access to any user passwords. Encryption takes place exclusively at the device level before the data is synchronised with LastPass and stored securely. The LastPass vault can therefore only be decrypted by the users themselves with the master password, which is never shared with LastPass. Our motto is: If we cannot access customer data, neither can hackers. LastPass has also recently achieved several security compliance certifications including SOC 2 Type II, SOC 3 Type II examinations. Since we have invested in high data protection right from the start, the certification serves as a seal of quality to the outside world,” explains Gerald Beuchelt, CISO at LogMeIn, the brand behind LastPass.
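To illustrate the zero-knowledge principle Beuchelt describes, the sketch below shows in rough terms how client-side vault encryption can work: a key is derived from the master password on the user’s device and only ciphertext is ever synchronised. It is a simplified illustration written with the open-source Python ‘cryptography’ package, not LastPass’s actual implementation; the iteration count and sample data are assumptions.

# Minimal sketch of client-side ("zero-knowledge") vault encryption, assuming
# the third-party 'cryptography' package. Illustrative only; not LastPass code.
import base64, os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def derive_key(master_password: str, salt: bytes) -> bytes:
    # Key derivation happens on the user's device; the master password never leaves it.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(master_password.encode()))

def encrypt_entry(master_password: str, plaintext: str, salt: bytes) -> bytes:
    # Only this ciphertext (plus the salt) would ever be synchronised to the server.
    return Fernet(derive_key(master_password, salt)).encrypt(plaintext.encode())

salt = os.urandom(16)
token = encrypt_entry("correct horse battery staple", "example.com : hunter2", salt)
print(token[:40])  # the server only ever sees ciphertext like this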
David Francis, information security consultant, KCOM, says that GDPR has set a precedent in its first year:
“Data privacy is now a recognised cornerstone in customer relationships. Companies must be able to take good care of the personally identifiable information they hold, or risk suffering major reputational damage.
“Compliance depends on having a clear system of control over your IT infrastructure. If you’re holding data on your customers, do you know where it resides? Do you know which cloud infrastructure elements are hosting which data? Heavily siloed, sprawling IT landscapes can spell disaster if you can’t control them.
“Companies holding PII need to work with a partner to ensure they have clear policies in place for data control, including managing their cloud infrastructure in line with GDPR requirements. Following Google’s €50m fine from CNIL, the rubicon has definitely been crossed – those companies that fail to put the work into building compliant systems will pay the price in the end. Now’s the time to act – get your systems in line before your customers are negatively affected.”
Yuval Ben-Itzhak, CEO, Socialbakers, says that it is encouraging to learn that other regions are also looking to adopt GDPR as well:
"GDPR brought privacy to Board rooms and front pages and made everyone re-think about what and how they deal with digital data. GDPR made a paradigm shift for many businesses and marketers. Now innovation takes the lead to provide a GDPR-safe business reality where personalized experiences and new business can still be created without compromising privacy."
Mike Kiser, Global Strategist and Evangelist, SailPoint, believes that GDPR isn’t going away any time soon:
“Europe’s data privacy regulation shook up the privacy world by imposing some of the strongest consumer protection laws of the last 20 years and inspired even stricter laws in other parts of the world. GDPR created a single breach-notification regulation for the entire EU with the goal of protecting personal data of EU citizens.
“So, one year in, how are organisations faring under GDPR? So far, there have been over 64,000 breach notifications, and regulators in 11 European countries have imposed $63 million (or £49 million) in fines. And these are just the first signs of a large wave to follow. With only 29% of EU organisations GDPR compliant, the breaches and fines will continue to happen. This reminds us that our identities comprise not just our attributes, but all personal data that relate to us.
“With one year under its belt, it doesn’t look like the GDPR is going anywhere anytime soon. By assessing risks with identity governance at the forefront, an organisation can create a roadmap to prioritise and remediate the most pressing regulatory gaps, and thus effectively control and secure the organisation’s data.”
Jasmit Sagoo, Senior Director, Northern Europe at Veritas Technologies, says that most organisations have done the bare minimum when it comes to data handling and storage:
“Generally, they’ve aimed to remove risks in two ways. Firstly, by deleting old data that is no longer necessary. Secondly, by taking steps to reduce risk of litigation. This could be through consent forms on websites that ask customers to allow them to use their data, or through emails informing customers of the new GDPR rules and that they hold information about them. Rather than correcting underlying data management challenges, these organisations are simply doing just enough to avoid any legal issues.
“This relaxed approach to data protection is being driven by the lack of GDPR fines and reprimands for companies that have fallen foul of the regulation.
“However, there is one way that GDPR has worked: through improving transparency.
“High-profile data breaches have made consumers increasingly cautious about what data they share, where it’s being stored and who it is accessed by. Our research has found that poor data protection can have a dire commercial impact on companies - 56% of consumers would dump a business that fails to protect their data, and 47% would abandon their loyalty and turn to a competitor. In the last year, when organisations have had a breach, they have taken the correct measures to reach out to customers. This allows customers to update their passwords and protect themselves. In an era of fake news and corporate suspicion, this honest approach has truly benefited the consumer.
“However, transparency alone is not enough. Going forward, it’s likely that law firms will begin to monetise GDPR by encouraging consumers whose information has been misused to seek compensation, and those organisations that have taken shortcuts may wish they hadn’t. To prepare for this, businesses need to ensure they have full visibility and control of the data they hold. It’s critical that they make use of technology that can help them locate, protect and manage data, before it’s too late.”
Benjamin Ross, Director, Delphix, emphasises that security begins at the point of inception:
“25th May marks exactly one year since GDPR was fully implemented. The overarching data protection law concerns personal data and applies to all European Union (EU) residents as well as any company or entity that markets goods or services to EU residents.
Since implementation, the effects of the regulation have rippled across the region with around 65,000 data breach notifications to date.
Organisations have been fined a total of €56 million over the past 12 months and we have even seen giants like Google trip up in their compliance efforts and receive hefty fines in return.
If there is one key takeaway from the last year, it is that security begins at the point of inception.
In today’s digital-first business landscape, software maintenance, development and testing is a critical factor. But non-production development and testing environments, vital as they are, pose an enormous increase in the surface area of risk and are often the soft underbelly for GDPR compliance.
In order to minimise the risk of non-compliance, it is no longer enough to play defence. Organisations must proactively protect personal and confidential data if they are to stay compliant and remain secure. Modern data masking solutions can help businesses achieve this by identifying confidential information, masking sensitive data values, and centrally managing data copies.
As we step foot into the second year with GDPR, it is important for organisations to understand that a foundational change in how data is accessed, managed, secured, and leveraged across the enterprise is key to staying compliant, dramatically reducing your company’s risk of a data breach and innovating at pace.”
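As a purely illustrative sketch of the kind of masking Ross describes, the Python snippet below pseudonymises an email field and redacts addresses embedded in free text before a record reaches a non-production environment. The field names, pattern and salt are assumptions made for the example and do not represent any specific vendor’s product.

# Illustrative sketch of masking PII before data reaches non-production environments.
# Column names and patterns are assumptions for the example, not a vendor API.
import hashlib
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def pseudonymise(value: str, salt: str = "dev-env-salt") -> str:
    # Deterministic token so joins between test tables still line up.
    return "user_" + hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def mask_record(record: dict) -> dict:
    masked = dict(record)
    if "email" in masked:
        masked["email"] = pseudonymise(masked["email"])
    if "notes" in masked:
        # Redact any stray email addresses embedded in free text.
        masked["notes"] = EMAIL_RE.sub("[redacted email]", masked["notes"])
    return masked

print(mask_record({"email": "jane@example.com", "notes": "contact jane@example.com", "order_total": 42.5}))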
David Smith, head of GDPR technology, SAS UK & Ireland, can’t help but notice the confusion surrounding GDPR:
“All that the first year of GDPR enforcement has really shown us is the depth of confusion over the regulation. It may be the topic that we’ve all heard enough about, but the simple fact is that widespread compliance simply hasn’t happened - although not necessarily through a lack of will.
"That’s not a reason to give up. GDPR compliance is one of the most important issues facing businesses today. Not because of fines or reputational damage - although those are big issues - but because GDPR is the first wave in a new era for data privacy. The business of the future is going to be built on the cultural foundations of GDPR, with the needs and security of the end-user at its heart, so it’s essential to align with that thinking now.
“To do that, companies have to understand the data they hold, where it originates, where it resides, where it travels and who uses it. They need to be able to decipher the digital labyrinth of their supply chain, as well as considering how the growing number of connected, consumer-facing things will impact the spread of personally identifiable information.
“To understand the ever-growing mass of customer data with which they’re faced, companies will need help. By implementing advanced analytics and AI-enabled systems, organisations can gain access to real-time, actionable insights about the state of their data landscape. That in turn will enable them to plan effective, targeted compliance programmes.
"Having a deep and constantly updated understanding of the data you hold also makes it easier to comply with privacy rights activations. Customers can request to be forgotten by your organisation or in some circumstances ask for an explanation of the decisions which concern them. Only through systematic analysis of the data can you root out every last occurrence of personal data that could otherwise come back to haunt you, and only through detailed analysis of your data flows and analytical models can you truly understand your decision making process. Our research found that the majority (56%) of consumers had plans to activate their rights in the first year of GDPR, so the risk is high.
“GDPR’s first anniversary may be an artificial milestone, but the importance of compliance is very real. Organisations need to equip themselves with advanced analytical tools to ensure they stay ahead of the curve."
Rene Hendrikse, EMEA MD, Mitek, says that GDPR is a business opportunity:
“Data privacy is unarguably the cornerstone of GDPR – but a year on, it’s no longer all about data. GDPR presents businesses with an opportunity to put what customers want first, beyond just data privacy and security.
“Take customer service. Long before GDPR, the best practice for responding to customers has been ‘in situ’ – through the same channel they reached out on – giving birth to ‘conversational commerce’. Research last year found that a majority of consumers prefer to use a messaging app to communicate with companies, and that 79% of millennials would rather use any method other than the phone for customer service. What’s more, there’s a sense of urgency – 54% of consumers want to hear back from a company they’ve messaged within one hour, and only 1% think it’s acceptable to wait more than a day.
“However, in the post GDPR-world, responding to a customer fast, in situ – and via a compromised platform (of which we’ve seen quite a few, including WhatsApp) – seems practically unthinkable. Brand risks associated with using a compromised app as part of the notification process when things go wrong – especially when there is no alternative means of communication – can be a terrifying predicament. This is why serious investment in the mobile channel for customer service is a must. As customers are increasingly reaching out to businesses on mobile, in a variety of ways, it’s time for businesses to adapt.
“The need for speed and ever-evolving customer expectations, combined with GDPR, have led to a complete overhaul of business models when it comes to data privacy. This has particularly impacted marketing, sales and customer service departments. For example, as the sharing economy continues to grow, companies are turning to technology to onboard good customers securely and in compliance with GDPR, as well as other EU regulations designed to protect against nefarious activity. For example, technology such as identity verification makes this process simple. A consumer can take a photo of an ID document, and AI is used to verify its authenticity. Then, biometric face comparison is used as a second layer of authentication to compare the ID document image with a selfie of the customer. All these advancements in tech mean businesses can attain improved data privacy and better, faster customer service.
“Investing in mobile technologies is a no-brainer a year on from GDPR. With data privacy having come on leaps and bounds in just a year, the next round of GDPR compliance will be simple: giving customers what they want.”
A handful of technology experts have come together to review the past year of GDPR, offer their opinions on the effectiveness of the regulation, and discuss what changes they would like to see in the next year.
It has been one year since the General Data Protection Regulation (GDPR) came into force. Billed as a complete overhaul and replacement of the Data Protection Act of 1998, it came at a time when technological advances and new ways of using personal data meant that the law needed to catch up and protect citizens’ data. The way organisations ultimately approach data privacy has been completely reshaped, and the introduction of the role of the Data Protection Officer (DPO) has helped enforce personal data privacy, particularly in organisations processing high volumes of personal data.
GDPR’s initial impact
Since the implementation of the infamous GDPR last May – a date that’s likely engrained on every IT team’s mind for all eternity – meeting data protection regulations has never been so important. Yet despite the day coming and going without a bang, we still see many companies living in a compliance no man’s land – not fully confident in their compliance, but also aware of the regulation and the implications of rogue data.
Steve Blow, Tech Evangelist at Zerto explains: “Although there have been significantly fewer fines than we all predicted, no business should become lax about compliance. My advice to those still in a grey area is to make sure their business is IT resilient by building an overall, comprehensive compliance program.
“A key component of this program should be backup. Backup that is continuously protecting data, making it easily searchable for long periods of time and ultimately, also, preventing lasting damage from any data breach you have to report. Peace of mind is a top priority for all IT teams and GDPR has definitely led to some sleepless nights, but with an IT resilience solution that has your back, you can rest easy.”
Neil Barton, CTO at WhereScape commented: “Despite the warnings of high potential GDPR fines for companies in violation of the law, it was never clear how serious the repercussions would be. Since the GDPR’s implementation, authorities have made an example of Internet giants. These high-profile fines are meant to serve as a warning to all of us. It’s a huge task to get your data house in order, but automation can lessen the burden. Data infrastructure automation software can help companies be ready for compliance by ensuring all data is easily identifiable, explainable and ready for extraction if needed. Using automation to easily discover data areas of concern, tag them and track data lineage throughout your environment provides organisations with greater visibility and a faster ability to act. In the event of an audit or a request to remove an individual’s data, automation software can provide the ready capabilities needed.”
Steve Armstrong, Regional Director, UK & Ireland at Bitglass agrees, stating that: “Amid much fanfare GDPR came marching over the horizon with bundles of confusion, poor interpretation and the usual “silver bullets” from the technology world. Outside of many technology companies extolling “the” solution to make organisations GDPR compliant (which frankly is a pure figment of their marketing team’s imaginations) there have been some interesting consequences of GDPR.
“From a technology perspective, organisations are being far more diligent on contracting terms and getting a clear understanding of how their data is being handled by their tech partners and, ultimately, what jurisdiction the data is being processed in.
“The C-suite now has much more responsibility for customer data protection. This likely caught many organisations off guard, but on the plus side it has broadened the conversation about data security from something the guys in the basement did to a board-level issue.”
Enforcing the regulation
Naaman Hart, Cloud Services Security Architect at Digital Guardian, believes that the key to any new regulation is the punishment attached to it and the regulators’ ability to enforce it. “Ultimately, without the plausible threat of punishment the regulations will fail to impact wide-sweeping change,” he comments. He continues: “While awareness is definitely up and companies have taken steps to address the criteria of the regulation, or minimise their risks from it, I don’t see evidence in recent breaches that the regulation is being followed.
“We’re at a turning point of sorts. If test cases start to emerge with significant fines then companies will start to take more notice and we should see some positive impact from the regulation.
“As we enter the second year of the GDPR we can but hope that cases and fines continue to paint a picture that companies cannot avoid punishment for poor data handling. If the risk outweighs the reward then we should see a societal shift towards better privacy which benefits everyone.”
Samantha Humphries, Senior Product Marketing Manager at Exabeam feels that the GDPR has created a lot of noise: “For EU data subjects, our web experience has arguably taken a turn for the worse with some sites blocking all access to EU IP addresses and many more opting to bombard us with multiple questions before we can get anywhere near their content (although at least the barrage of emails requesting us to re-subscribe has died down). And it has definitely kept its parents busy: in the first nine months, over 200,000 cases were logged with supervisory authorities, of which ~65,000 were related to data breaches.
“With the GDPR still very much in its infancy, many organisations are still getting to grips with exactly how to meet its requirements. The fundamentals remain true: know what personal data you have, know why you have it, limit access to a need-to-know basis, keep it safe, only keep it as long as you need it, and be transparent about what you’re going to do with it. The devil is in the detail, so keeping a close watch on developments from the EDPB will help provide clarity as the regulation continues to mature.”
Turning to technology
The introduction of the GDPR has impacted an unprecedented number of business processes, and security and risk teams are struggling to meet all these simultaneous demands.
According to Hubert Da Costa, SVP and GM, EMEA at Cybera, there is a more positive outcome from this: “On a more positive note, it has also brought an opportunity for companies to leverage new or additional technology solutions. Take the network edge as an example. This is one of the primary areas where personal data is at risk. In the past 12 months we’ve seen many organisations using GDPR as an opportunity to replace traditional VPN technology at the edge with SD-WAN technology. Due to its multiple data security capabilities, and levels of visibility and auditability, SD-WAN enables organisations to better meet GDPR guidelines. With Gartner predicting that before the end of 2021, more than one billion euros in sanctions for GDPR non-compliance will have been issued, we’ll continue to see security and risk teams under pressure to protect user data and privacy.”
Looking across the pond
The GDPR has further impacted other parts of the world. Last year, the California Consumer Privacy Act (CCPA) was signed into law, which aims to provide consumers with specific rights over their personal data held by companies. These rights are very similar to those given to EU-based individuals by GDPR one year ago. The CCPA, which takes effect on 1st January 2020, is the first of its kind in the US, and while good for consumers, affected companies will have to make a significant effort to implement the cybersecurity requirements. Plus, it will add yet another variance in the patchwork of divergent US data protection laws that companies already struggle to reconcile.
Wendy Foote, Senior Contracts Manager at WhiteHat Security, considers how the GDPR has impacted laws around the globe. “If GDPR can be implemented to protect all of the EU, could the CCPA be indicative of the potential for a cohesive US federal privacy law?” she questions. “This idea has strong bipartisan congressional support, and several large companies have come out in favor of it. There are draft bills in circulation, and with a new class of representatives recently sworn into Congress and the CCPA effectively putting a deadline on the debate, there may finally be a national resolution to the US consumer data privacy problem. However, the likelihood of it passing in 2019 is slim.
“A single privacy framework must include flexibility and scalability to accommodate differences in size, complexity, and data needs of companies that will be subject to the law. It will take several months of negotiation to agree on the approach. But we are excited to see what the future brings for data privacy in our country and have GDPR to look to as a strong example.”
This regulation has gone a long way to improve data privacy and citizens’ rights. But there is still a long way to go. Here’s to another year of GDPR and the positive changes it will help make.
IDC estimates worldwide data volume is set to rise by 61% between 2018 and 2025 – eventually reaching 175 zettabytes – with much of this generated by businesses. So how can this be harnessed to optimise business processes, improve day-to-day operations and inform decision-making? The answer lies with humanised machine learning platforms, says Mind Foundry Director of Research Nathan Korda, which are making advanced machine learning capabilities accessible to business problem owners, enabling the rise of the ‘citizen data scientist’.
Too much data, too little time
Many businesses today are struggling to analyse and extract full value from the wealth of data being generated and gathered daily. The challenge that lies with business problem owners – whether this is a C-level executive, analyst or even operations manager – is how to effectively understand their data to drive further business value and optimise processes.
They may have spreadsheets full of data and use simple data models to extract limited value, but how can they take this further? The answer lies with greater accessibility of machine learning through user-centric platforms. For the first time, this enables business problem owners – those with intimate knowledge of specific problems and their impact on operations – to connect advanced machine learning capabilities to business value.
The benefits are available to all
Machine learning has traditionally been viewed as requiring extensive resources, time and technical expertise, which often includes hiring data scientists – a highly specialised field where talent demand currently outstrips supply. Beyond this, data scientists are often too separated from a business problem to contextualise it and understand the full impact it has on operations.
Enter the citizen data scientists – employees not operating in dedicated data science or analytics roles, who can use a humanised machine learning platform to explore their data and easily deploy models to unlock the value it holds. Thanks to user-centric platforms, current employees can enjoy access to machine learning technology without the need for specialist training. This is a significant milestone in empowering data owners to quickly master their own data and complete operations at scale, without significant investment or expertise. At the company level, this puts advanced machine learning solutions into the hands of small and mid-sized organisations and their employees, who may be lacking data science expertise. But the increased accessibility of machine learning also generates fresh opportunities for data scientists, freeing up their time to get closer to business problems and focus their skill set on innovation for digital transformation projects.
New business capabilities – at speed and scale
A machine learning platform provides citizen data scientists with greater accessibility to the capabilities required to quickly prepare and visualise data, and subsequently build, deploy and manage a suitable model. Whether this involves suggesting actions to clean and correctly format data or recommending the most suitable model for a data set, a humanised platform is designed to guide users through the process from start to finish.
A core aspect of this approach is reducing the volume of mundane data preparation tasks. Think of business processes that are repetitive and involve analysing data in a similar way on a routine basis, such as budget forecasting. Instead of tying up senior management resources for several weeks to finalise budgets based on expected business outcomes, managers can use an intuitive machine learning platform to quickly identify and set up a model capable of being reused to revise budgets annually – dramatically cutting the time investment in this process going forward.
Alternatively, take an advanced manufacturing company that develops and produces precision components. They may have machinery experts with decades of industry experience and a deep understanding of the data produced by equipment sensors – but they can’t identify patterns and areas for optimisation without a dedicated data science team. With humanised machine learning platforms, these experts can input, cleanse and visualise data in minutes, then select an appropriate data model to uncover previously unseen insights.
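As a hypothetical illustration of the workflow such a platform automates, the short Python sketch below cleans a set of sensor readings, fits a model and surfaces which signals matter most. The file name, column names and the choice of a random forest are assumptions made for the example, not a description of any particular platform.

# Hypothetical example of the workflow a humanised ML platform automates:
# clean sensor readings, fit a model, and surface which signals matter.
# The CSV schema ('temperature', 'vibration', 'pressure', 'defect') is an assumption.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

df = pd.read_csv("sensor_readings.csv")   # hypothetical export from equipment sensors
df = df.dropna()                          # the basic cleaning step a platform would suggest
X = df[["temperature", "vibration", "pressure"]]
y = df["defect"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
print(dict(zip(X.columns, model.feature_importances_.round(3))))  # candidate areas to optimise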
Man meets machine: complementary capabilities
Machine learning platforms are intended to amplify existing employee skill sets. They remove a large amount of the time and resources traditionally invested into applying machine learning to business data, yet ownership and control of the process still lies with the user. This is key to successful use of machine learning technology.
Machine learning applications are excellent for risk assessment and management, and making data-driven judgement calls, but lack the intuition and creativity required to contextualise and problem-solve for human affairs. This is where humanised machine learning platforms draw the line between ‘human’ tasks and ‘computer’ tasks. They take on the labour-intensive, repetitive tasks such as data cleaning, data-driven model discovery, and model validation, and empower problem owners to focus their time and resources more directly on the business problem at hand.
Ultimately, the computer will always have to collaborate with a human when applying machine learning. To ensure project success, machine learning needs to form part of a human team, augmenting human skills, intelligence and capabilities. Humans have the unique capability to contextualise data and associated errors. Take a simple example where error codes are present in a large data set. A machine learning platform will struggle to contextualise this, but a human who is close to the business process can quickly provide an explanation, such as sensors being out of range.
Beyond the immediate benefits, machine learning platforms solve the issue of legacy once a citizen data scientist leaves the company. These employees can develop machine learning solutions to solve specific business problems, secure in the knowledge these accomplishments will still be operational, intuitive and reusable by colleagues once they have moved on.
Machine learning is now viable for every business
Machine learning is set to become increasingly common among businesses of all sizes as they push to optimise their daily operations. Don’t forget, business problem owners will always have a unique and intimate knowledge of a specific problem and its relevance to existing business priorities. For the first time, they can directly identify and enhance the value of their data by quickly harnessing machine intelligence at scale.
Applying machine learning to data no longer needs to be an arduous, resource-consuming project spanning several months. The rise of citizen data scientists is bringing significant opportunities for smaller and mid-sized businesses to quickly harness advanced machine learning capabilities to unlock greater insights and business value from their data.
Nathan Korda is Director of Research at University of Oxford machine learning spin-out Mind Foundry.
Paul Johnson, UK Data Centre Segment Leader at ABB, answers a few questions about how critical infrastructure at data centres such as power distribution and UPS systems can be scaled to meet the needs of server operations, and how data centre operators can use this as a ‘pay as you grow’ approach.
Why is scalability important?
In a recent survey, nearly two thirds (64%) of data centre experts identified scalability as their number one challenge at work. This makes sense when you consider that, without being able to scale, they have the challenge of meeting growing demand at the same time as maintaining uptime and operational efficiency. As demand on servers varies, so does power load and deployment of UPS systems.
Until recently, operators have considered power distribution as a fixed cost, much like the fabric of the data centre buildings and the grid connections that feed them. However, the ability to scale will enable them to meet demand as well as finding efficiencies at times when demand is shrinking.
As a result, data centre operators need a power infrastructure that is flexible and scalable. This will help them to better meet the needs of their servers – and ultimately their customers. This is where Elastic Critical Infrastructure (ECI) can help.
What is Elastic Critical Infrastructure (ECI)?
The concept of ECI is based on the use of many small standard modular design blocks that can be combined to offer the same performance as a large-scale system. This enables data centre operators to scale up or down to meet demand. The approach can be used for medium voltage switchgear, UPS systems and also low voltage switchgear, and is configured in such a way that data centre operators can use it via a 'pay as you grow' approach.
The key to ECI is its use of standard products that are readily available on short lead times and can be deployed at scale. On top of that are the smart control and communication systems that help to manage these systems and make significant differences in levels of efficiency.
How can the approach be deployed at scale?
While a basic 2N system architecture provides certainty that the infrastructure will supply loads, the drawback of it is that an operator can never achieve more than 50 percent utilisation from their infrastructure. That is because of the arrangement in 2N system architecture, where two parallel systems are each capable of meeting the data centre’s entire load.
By comparison, with numerous small modules used with ECI, operators can combine modules in different configurations to meet the load in a flexible way.
This is particularly attractive when delivering UPS systems, which are relatively costly per kW, particularly by comparison to other types of electrical infrastructure, such as medium and low voltage switchgear and transformers.
In the example of a 1.5 MW server load, four 500 kW UPS modules can more than cover the load and an outage on one module will not affect overall power availability as the remaining three modules can work together to meet the entire demand.
With ECI, operators can install different architectures and move between them as the facility grows. Depending on the size of the block that is used, a facility could start out with two 1MW power streams in a 2N architecture. However, as it increases in size and capacity, an additional power stream can be included to reconfigure the system to be a distributed redundant, 3 to make 2 architecture.
As a general rule, as an operator adds more modules, the flexibility grows. More modules provide operators with more options to create load groups. An additional benefit is that the impact of any single module decreases. This means that operators can optimise the system by reducing overall capacity, while increasing the utilisation factor (UF) of each module and of the system as a whole.
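The arithmetic behind these configurations is straightforward, and the short calculation below illustrates it using the figures discussed above: a 1.5 MW load served either by a 2N pair of 1.5 MW streams or by four 500 kW modules in an N+1 arrangement. The utilisation factor here is simply load divided by installed capacity; the numbers are indicative rather than a vendor sizing exercise.

# Back-of-envelope utilisation figures for the configurations discussed above.
# Load and module sizes mirror the 1.5 MW / 500 kW example; the formula is the
# usual utilisation factor (load / installed capacity), not a vendor calculation.

def utilisation(load_kw: float, module_kw: float, modules: int) -> float:
    return load_kw / (module_kw * modules)

load = 1500  # kW server load

# 2N: two independent 1.5 MW streams, each able to carry the full load.
print(f"2N:          {utilisation(load, 1500, 2):.0%}")   # capped at 50%

# ECI-style N+1: four 500 kW modules; any one can fail without dropping the load.
print(f"N+1 modular: {utilisation(load, 500, 4):.0%}")    # 75% of installed capacity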
How does ECI reduce stranded capacity in a UPS?
In traditional monolithic UPS systems, capacity can become stranded. They contain a small number of large UPS blocks, which also contain many single points of failure. As a result, the UPS is sized to suit the final load – and if that load draws only some of the capacity, then the remaining capacity is stranded. A further disadvantage is that if any one module fails, it has a larger impact on the availability of the whole UPS system.
By comparison, with ECI, modular UPS units are used, each of which has its own static switch and control unit. The result is a greater level of flexibility. The data centre management system can group power modules to feed multiple loads and make full use of the capacity. In addition, the Decentralised Parallel Architecture (DPA) ensures higher availability of backup power.
For example, ABB’s DPA modular UPS has its own dedicated control logic, static bypass, user interface and switchgear that enables each power module to act autonomously as a complete UPS.
What is the benefit for transformers and circuit breakers?
A fault on a single large transformer unit can lead to high fault currents – and this has a knock-on effect on the sizing and fault-current capabilities of circuit breakers. However, by dividing the load between multiple smaller transformers, ECI helps operators safeguard their installations from high fault current levels.
These lower fault levels also limit the incident energy in the event of an arc flash. And a further benefit is that operators can downsize their protection devices such as air circuit breakers and moulded case circuit breakers, which can also lead to significant cost savings.
What are the implications for protection, control and communication?
The big implication of using more modules of a smaller size is that operators need to implement more protection and control when adopting ECI. These schemes need to be capable of reacting fast, and therefore need to be based on digital technologies, such as digital metering of current and voltage, as well as monitoring of temperature and component wear. In addition, they need smart controllers and software able to control and optimise the flow of power to the servers.
The result is accurate and timely data for data centre infrastructure management (DCIM) systems to optimise operations and make decisions based on availability, status and condition of critical power.
ECI is scalable, flexible and easy to manage. For operators who have plans to scale, the foundations of ECI make it the right choice.
The growth of the IoT market is impressive, with analysts predicting that the number of connected devices around the world is on track to grow to almost 31 billion by 2025. The increase in the number of connected assets has enabled organisations to amass a significant amount of data which, when used effectively, can add immense value to an organisation by enabling crucial insight in terms of business strategy and generating efficiencies. Additionally, the projected impact of AI and 5G on all sectors will help to unlock additional scale, security and interconnections between all parts of the IoT landscape, amplifying the positive impacts of the technology for all stakeholders. So how will these key elements influence and impact IoT as the technology continues to evolve at breakneck speed?
Nick Sacke, Head of IoT and Product, Comms365, explains.
Big Data
When it comes to big data, there is a business opportunity to acquire, manage and sell IoT datasets, which provide key insights and potential influence over the lives of billions. Larger organisations such as Bosch and GE have already been amassing data on a vast scale, and with the deployment of new sensor networks, they are generating huge data sets. However, what is lacking is the structure around how to exploit this business opportunity – how to share and profit from it.
The only way that the market will grow, as anticipated, is in correlation with the further regulation of data in terms of how it can be accessed and the security standards, as well as consolidation within the industry over who actually holds the data. With clear regulations in place, any confusion around the business potential of big data can be overcome.
One industry where this is already happening is insurance. With IoT, data can be collected and managed at an impressive rate and scaled to gain valuable information such as customer insights, real-time risk analysis and fraud detection. This accurate and efficient risk assessment can inspire the development of more flexible and bespoke products, sparking innovation throughout the insurance value chain. It is through tangible use cases such as these that will also encourage other industries to follow.
Artificial Intelligence (AI)
AI is already well embedded in our culture and is achieving great success in driving efficiencies through numerous industries. For example, in the medical world, AI is able to learn from data sets and make projections about cancer diagnosis by looking at scans, with the same accuracy as an industry expert but at a faster rate. This is a huge step forward for technology becoming more entrenched in the applications and processes of cities, corporations and our day to day lives. The accuracy and capabilities of AI are already high and this will only grow further as more decisions and reactions are automated by machines. Combined with IoT technology, AI can quickly determine insights and detect anomalies in data, offering fast and accurate predictions to improve operational efficiencies.
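As a hypothetical sketch of that kind of anomaly detection, the snippet below trains an isolation forest on simulated sensor readings and flags the outliers for investigation. The data, contamination rate and thresholds are assumptions made purely for illustration.

# Hypothetical sketch of anomaly detection on a stream of IoT sensor readings.
# The readings are simulated; contamination rate and scales are assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
normal = rng.normal(loc=50.0, scale=2.0, size=(1000, 1))   # e.g. routine temperature readings
spikes = rng.normal(loc=75.0, scale=1.0, size=(5, 1))      # faulty-sensor excursions
readings = np.vstack([normal, spikes])

detector = IsolationForest(contamination=0.01, random_state=0).fit(readings)
flags = detector.predict(readings)                          # -1 marks an anomaly

print(f"{(flags == -1).sum()} readings flagged for investigation out of {len(readings)}")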
AI can also have a significantly positive impact when it comes to healthcare. With the population growing considerably each year, more people require services and combined with an ageing population, social care services are increasingly stretched. AI can operate as a system in the background to support these services, by checking heart rates and other metrics that could indicate a potential hospitalisation before it turns to a crisis, thus taking a proactive approach and reducing the load on reactive emergency care.
Another industry where AI will have a strong impact is agriculture. Our planet is undergoing significant changes in terms of weather and climate challenges, but through the use of AI, our outlook could be vastly improved – we will be able to react faster and proactively intervene before issues occur. For example, drought is a problem in many locations; coupled with growing population numbers, food production on an aggressive scale is needed, and without a solution in place this challenge could soon become much more serious. Machine learning can determine how higher yields can be generated and which geographical areas shouldn’t be used for planting, helping farmers get more from their land in a more sustainable way and taking the guesswork out of farming.
One area where AI will start to play a more dominant role is in security. The UK is one of the most CCTV intensive countries in the world but in reality, a computer can read video much faster than a human. Through machine learning, software can now be programmed to identify a particular person or vehicle, and also flag unusual activity based on predetermined factors, to highlight activity that might require further investigation. Therefore, a fundamental shift is anticipated whereby some of the more process-intensive activity can be actioned by AI, freeing up more meaningful tasks to be completed with human interaction.
5G
Although the 5G standards are yet to be finalised, it is expected that 5G will be 10 times faster than 4G at an estimated 10 Gbps, as well as having ultra-low latency. These potential speeds can barely be compared to the current ‘super fast’ fibre broadband available in the UK, which is up to 200 Mbps.
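To put those headline rates in perspective, the rough calculation below shows how long a 10 GB transfer would take at each quoted speed, ignoring protocol overhead. The 4G figure simply follows from the 'ten times faster' claim, and all numbers are indicative only.

# Rough illustration of the headline figures above: time to transfer 10 GB of
# data at each quoted rate (ideal throughput, ignoring protocol overhead).
RATES_MBPS = {"5G (est.)": 10_000, "4G (approx. 10x slower)": 1_000, "UK 'superfast' fibre": 200}

payload_gb = 10
payload_megabits = payload_gb * 8 * 1000  # 1 GB ~= 8,000 megabits

for name, rate in RATES_MBPS.items():
    print(f"{name:>24}: {payload_megabits / rate:>7.1f} seconds")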
With commercial 5G roll out expected in 2020, we anticipate that pockets of campus-like 5G networks will spring up in terms of research facilities and localised networks, before there is a national one. 5G will be delivered by carriers installing ‘small cells’ that will allow a 5G infrastructure to be deployed. A number of carriers will use it for capacity augmentation and extending 4G capability, so 5G will not only provide higher bandwidth, it also allows for the network to be partitioned in several ways. For example, IoT over 5G will have its own dedicated pathway, as will voice.
With the promise of ultra-low latency, faster speeds and reliability, 5G will enable IoT innovation to be extended further to multiple use cases across numerous industries. In particular, low latency improves response times, so when it comes to manufacturing machinery or autonomous vehicles, it will ensure near-instant reactions to safety issues. However, to ensure the 5G IoT ecosystem functions effectively, cities and businesses will need to have a strategy in place to support 5G networking to ensure the full benefits of the technology can be realised.
We asked a range of industry professionals how collaboration in the IT space needs to develop. This is in terms of both how end user organisations need to get internal departments to work together to ensure faster and smarter new product and services development; and also how it is increasingly important for vendors to work together to provide hyperconvergence/integrated solutions which are ready to work 'out of the box'. Not to mention the importance of the supply chain working closely with the customer. Part 1.
Christophe Reyes, UCaaS Managing Director at Arkadin, comments:
When it comes to driving maximum success in today’s business landscape, the need for better collaboration no longer exists solely between technology providers and their customers. Achieving the levels of staff engagement and productivity that modern businesses require to harness growth at scale now relies heavily on deploying technology integrations that make these efforts simple, low-risk and stress-free. In many cases, this means implementing a range of tools and solutions into a business’ existing infrastructure in order to enhance collaboration without the expense of completely overhauling the systems already in place.
This growing demand for a more converged approach to technology from the end user ultimately requires greater collaboration between the suppliers themselves, and this is now being recognised by many providers. Take the communications sector, for example – which by its very nature is expected to provide effective ways for people to connect and interact. In an effort to join up solutions and deliver a more unified service, we are now seeing an uptick in vendors finding new ways to combine their latest cloud-based communication tools, leveraging the strengths of each solution to drive innovation with smarter interactions and meet the demands of the modern digital workplace for the end user.
As we continue advancing through a digital shift in the way people and businesses collaborate, traditional business models are rapidly receding and organisations now operate in many more ways than one. It is crucial that vendors in all industries recognise that every company is different and work ever more closely to not only deliver the integrated solutions that modern businesses both want and expect, but also to increase the footprint of their own technologies in a highly competitive digital era.
Collaboration is the future, says Daniel Creigh, Head of UK & Ireland, Zoom Video Communications:
Amid uncertainty, there’s one certainty: business is more global than it has ever been, and trading with partners, customers and suppliers across the world is set to characterise business in the future. Connectivity lies at the heart of collaboration and is central to knowledge, idea and information sharing, which are essential for businesses to become more productive and efficient as they make their digital transformation journeys. There’s no room for isolation in the digital economy. Even on a local level, we need to think about how we maintain communication with peers and colleagues, given that events – from extreme weather conditions to the unpredictable nature of public transport systems – conspire to delay us from doing our best work, putting us at a disadvantage to our connected colleagues.
There’s no doubt that the outlook for the UK economy is still in question, with more uncertainty ahead as it tries to understand its future post Brexit. Many companies are responding to the current dynamic by moving toward agile workplaces to cut operational costs, moving their staff to smaller, more cost-effective premises while encouraging hot-desking and remote working. As such, how we define the workspace is changing, and today it’s not unusual to see work conducted in environments as wide-ranging as huddle spaces and coffee shops; what matters is that the work gets done, rather than where it gets done.
Many companies are increasingly deploying video communications as a way of collaborating, with a view to introducing it to every single meeting room, desk and employee. This isn’t just in response to what’s happening right now, but also in anticipation of future trends, one of which is the rise of Gen Z in the workplace. Immersed in tech from day one, their willingness to embrace technology – and video specifically – eclipses that of previous generations, who took a perhaps more guarded view of change. Here, video facilitates not just the important act of collaboration, but also communication in a way that helps us to build better relationships with key stakeholders at work and to foster personal connections.
Directly connected to customer satisfaction is how good employees feel at work. It’s imperative, therefore, to make sure you do everything you can to connect on an emotional level to build productive and lasting professional relationships. The knock-on effects internally have the capacity to bolster and – critically – motivate your entire team. As ever, this initiative needs to be driven from the top. When the C-level or other members of the leadership team use collaboration tools and have two-way conversations with as many employees as they can, employees are more likely to believe that they are part of a broader team in a business with clear goals and a clear path to achieving them. The knock-on benefits further motivate staff who, generally, will provide a better and happier service to their customers; at a time when many companies are committed to their digital transformation path, engaging positively with customers is even more imperative.
All businesses are facing uncertain political and economic times. But success through trying times lies in unity, a scenario that can be embraced and achieved through technology that keeps everyone engaged, connected and collaborating, wherever they are.
Richard Middleton, country manager for UK&I at Lifesize, comments:
“Collaboration between staff is key to business success, but implementing a collaborative workplace culture may not be as simple as it seems. Businesses face two major problems: the rise of remote working and the lack of integration between communication tools.
“Our workplace is truly becoming borderless, with staff and communication between employees no longer confined by office walls. This is particularly apparent in the IT space, with staff increasingly demanding to work from home and businesses having to look overseas to fill the digital skills gap. In this global business environment, the value of secure, reliable and effortless communication and collaboration is no longer in question.
“Using collaboration tools effectively can bring immediate benefits to a company, enhancing internal and external communications and powering flexibility. According to the Institute of Leadership and Management, 84 per cent of managers who have implemented flexible working schedules in the UK have seen improvements in productivity, commitment and retention of staff.
“But many businesses face the headache of juggling multiple tools to collaborate. To truly flourish, companies therefore must pursue a simpler, more unified experience for scheduling their meetings, managing collaboration and more. The onus is also on companies like Lifesize to create a platform which can seamlessly integrate into the existing tools used by a business on a daily basis, such as Microsoft Teams, Slack and Microsoft Office 365.
“When effectively collaborating across the global business environment, IT companies can reap the rewards. For example, we worked with global tech company WP Engine to help them bridge the geographical divide between their offices. Despite being spread across three continents, a unified collaboration experience enables co-workers to instantly connect with one another and managers can effectively manage remote employees in one-on-one video calls. The face-to-face interaction helps employees have engaging conversations and personal connections with managers and team members thousands of miles away. As Sarah Jones, Sr HR Business Partner at WP Engine puts it, ‘by being able to jump on Lifesize, it just feels as though we are one team, one office working together seamlessly.’”
Machine learning and artificial intelligence are two of the hottest topics in technology. However, few organisations are using either concept in real-world scenarios. For many, the biggest challenge is adapting tasks, often human-centred, to take advantage of the new, embryonic technology. Instead, many are focused on providing intelligent assistance through systems that automate mundane tasks and provide suggestions that help improve efficiency, while still leaving the human as the central intelligence.
By Tony Lucas, Director of Product, Smartsheet.
Artificial Intelligence (AI) is still considered by many researchers as an aspirational term to reference technology that doesn’t quite exist other than in Hollywood movies. However, in reality the technology behind the concept can perform useful tasks. That technology is machine learning (ML), which can be viewed as just a new way to program computers to perform tasks that we don’t quite understand (yet).
One interesting way to look at machine learning is to view it as writing a program based on data rather than explicit coding. This is revolutionary compared to standard architectures and can be incredibly powerful when applied to problems where we have lots of the right kind of data and where we don’t entirely understand how to solve a problem explicitly. Through the process of machine learning, we can mathematically solve a problem or do something that we can’t necessarily solve with programming code.
One of the big strands is prediction, with machines able to sift through huge amounts of data to spot patterns and predict what will happen in a range of circumstances. From regulating traffic flows in cities to predicting when a machine will break down, these systems are getting more accurate each year.
Straight and narrow
Yet the most common human interactions with this technology are through intelligent assistants, which range from mechanical systems such as cruise control to applications like Alexa, Google Home and web-based bots that can seemingly have a conversation, answer questions and carry out tasks on request.
In their simplest form, systems such as cruise control offer a feedback loop that regulates a simple variable, such as acceleration and braking, to maintain a constant speed. More advanced systems may use cameras to read road signs and lane markings to make steering decisions. However, truly autonomous driving needs to process an order of magnitude more inputs, along with experience-based decisions that are built through driving – this is the stage we are at now, with autonomous vehicle tests collecting data from limited trials to “learn” all these conditions and codify these factors into working models.
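To illustrate the feedback-loop idea in the simplest possible terms, here is a toy sketch in Python (purely illustrative, not any real vehicle system; the numbers are invented):

    # A minimal proportional feedback loop in the spirit of cruise control:
    # the controller adjusts the throttle in proportion to the gap between
    # the target speed and the measured speed.
    def throttle_adjustment(target_speed, measured_speed, gain=0.5):
        error = target_speed - measured_speed      # how far off we are
        return gain * error                        # proportional correction

    speed = 20.0                                   # current speed in m/s (made up)
    for _ in range(10):
        speed += 0.2 * throttle_adjustment(31.0, speed)   # toy vehicle response
    print(round(speed, 1))                         # creeps towards the 31 m/s target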
Systems that can act on human textual and spoken commands or conversations can potentially be pushed further, as there are fewer life-threatening dangers if the technology goes wrong. Many of the first generation of systems are closer to search engines than intelligent systems, but they are being exposed to millions and ultimately billions of queries and follow-up questions, plus billions of lines of conversation between individuals and groups; getting them to understand the syntax and context of those conversations remains a huge challenge.
At a more granular level, machine learning incorporates several different approaches which start with Supervised Machine Learning Algorithms: These are most like the technique detailed above. They involve the use of a training dataset with inputs and outputs for which a machine creates an inferred predictive function that allows it to turn inputs into desired outputs after enough training. This type of algorithm also enables a machine to learn from mistakes — that is, when an input isn’t converted into the desired output.
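To make the supervised case concrete, here is a minimal sketch in Python (it assumes the freely available scikit-learn library and one of its bundled sample datasets, and illustrates the idea rather than any particular product):

    # Supervised learning in miniature: infer a predictive function from
    # labelled examples, then score it on examples it has never seen.
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)                 # inputs and labelled outputs
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = LogisticRegression(max_iter=1000)         # the inferred function
    model.fit(X_train, y_train)                       # "training" on labelled data
    print("accuracy:", model.score(X_test, y_test))   # how often inputs map to the desired outputs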
Unsupervised Machine Learning Algorithms are similar but use training data that is not classified or labelled. The idea here is not to generate outputs, but to describe hidden structures that might be present in unlabelled data, so the machine’s goal is to infer a function that can describe these hidden structures.
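A comparably minimal unsupervised sketch, again assuming scikit-learn: no labels are supplied, and the algorithm’s only job is to describe structure hidden in the data by grouping similar records.

    # Unsupervised learning in miniature: cluster unlabelled data.
    from sklearn.cluster import KMeans
    from sklearn.datasets import load_iris

    X, _ = load_iris(return_X_y=True)                 # labels deliberately ignored
    clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
    print(clusters[:10])                              # inferred group for each record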
Reinforcement Machine Learning Algorithms use a behaviour-reward learning method by which a machine learns which “behaviours” will earn it rewards — including delayed rewards — through trial and error. The goal is for a machine to be able to determine the most suitable behaviour in a given context.
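And a toy reinforcement learning sketch in plain Python: the agent receives a delayed reward for reaching the end of a small corridor and learns, through trial and error, which behaviour each state favours (the environment is invented purely for illustration):

    # Behaviour-reward learning in miniature (tabular Q-learning).
    import random

    n_states = 5                                   # cells 0..4, the goal is cell 4
    Q = [[0.0, 0.0] for _ in range(n_states)]      # value of each (state, action)
    alpha, gamma = 0.5, 0.9                        # learning rate, discount factor

    for _ in range(300):                           # episodes of trial and error
        state = 0
        while state != n_states - 1:
            action = random.choice([0, 1])         # 0 = left, 1 = right (explore)
            nxt = max(0, state - 1) if action == 0 else state + 1
            reward = 1.0 if nxt == n_states - 1 else 0.0      # delayed reward
            Q[state][action] += alpha * (reward + gamma * max(Q[nxt]) - Q[state][action])
            state = nxt

    print([q.index(max(q)) for q in Q[:-1]])       # best action per cell: [1, 1, 1, 1] = "go right"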
Intelligence in action
To put these in context, a recent project that one of our technical teams worked on for a large professional services organisation integrated a machine learning “bot” with a set of workers who communicate using Slack, a team communication and messaging platform. The bot sits in the background analysing the flow of conversation and chimes in occasionally with (hopefully) useful inputs. For example, if somebody asks, “Hey, can somebody send me a copy of the last worldwide sales report?”, and there is no response, the bot will autonomously attempt to match the request against its understanding of how similar requests may have been handled, alongside its library of stored reports. This is a simplistic description, as the system also needs to determine phraseology and syntax such as “last” as in chronological or based on year – which may well prompt it to respond with a question such as, “Would you like the sales report for 2018? Or the sales report from Q1 of 2019?”, before handling the response. This questioning phase is vital in helping these systems to learn user requirements and context.
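A drastically simplified sketch of the matching and clarifying-question steps might look like the following (the report titles and scoring rule are invented for illustration; this is not the actual bot):

    # Score a free-text request against a small library of stored reports by
    # word overlap, and ask a clarifying question when two matches are close.
    def overlap(request, title):
        req, words = set(request.lower().split()), set(title.lower().split())
        return len(req & words) / len(words)

    reports = ["Worldwide sales report 2018",
               "Worldwide sales report Q1 2019",
               "EMEA marketing report 2019"]
    request = "can somebody send me a copy of the last worldwide sales report"

    ranked = sorted(reports, key=lambda r: overlap(request, r), reverse=True)
    best, runner_up = ranked[0], ranked[1]
    if overlap(request, best) - overlap(request, runner_up) < 0.2:
        print(f"Would you like '{best}' or '{runner_up}'?")   # ask before answering
    else:
        print(f"Here is '{best}'.")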
In another project, an automated, intelligent assistant is learning to deal with common customer support questions. Common requests such as “Where can I order spares?” can be answered much faster by a machine than a human operator. However, when the system sees a request that it can’t find a match for, it may escalate the request to a human operator and then analyse the responses and customer interaction using its learning algorithms to ultimately add to its own set of knowledge. As it deals with more questions successfully, the number of exceptions tends to decline, requiring less human interaction.
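In outline, and purely as an illustration of that escalate-and-learn pattern (the FAQ entries and matching threshold below are assumptions), the logic can be as simple as:

    # Answer automatically when a confident match exists; otherwise escalate to
    # a human operator and remember the outcome for next time.
    import difflib

    answers = {"where can i order spares": "Spares can be ordered via the parts portal."}

    def handle(question, cutoff=0.75):
        match = difflib.get_close_matches(question.lower(), list(answers), n=1, cutoff=cutoff)
        if match:
            return answers[match[0]]                   # handled automatically
        reply = "(escalated to a human operator)"      # exception path
        answers[question.lower()] = reply              # learn from the interaction
        return reply

    print(handle("Where can I order spares?"))         # automatic
    print(handle("How do I reset my password?"))       # escalated the first time
    print(handle("How do I reset my password?"))       # handled next time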
AI meets work execution
This process of data gathering, analysis, trial and error is at the heart of many ML/AI initiatives, and as with both examples above, these systems are increasingly baked into work execution platforms that essentially help define a complex process as a series of discrete steps carried out in series or in parallel. Work execution platforms are used in a growing number of industries to streamline work processes with automated actions. Managing complex projects is a common use case, but these systems are also used for prioritising and approving budget requests at large enterprises such as Office Depot and Cisco, and even for building coffee shops or planning the filming schedule for large Hollywood studios.
The next generation of these systems are now adding AI, aiming to integrate natural language user experiences that merge business process automation with popular messaging platforms and business systems like Slack, Workplace by Facebook, Salesforce, Google, Vonage and Hubspot.
One of the biggest advantages is that chat overcomes some of the complexities of traditional programming languages to allow users to build their own integrations and capabilities on top of existing workflows.
For example, field workers can more easily submit safety issues including descriptions, photos and location data through a chatbot interface into a work execution platform where they will be logged and then acted upon through either automated or human interaction, based on severity, type, location, or other factors.
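A hedged sketch of that routing step, with invented field names and rules, might look like this:

    # A safety report arriving via chat is logged, then actioned according to
    # its severity and type (the rules here are illustrative assumptions).
    def route_safety_issue(issue):
        record = dict(issue, status="logged")
        if issue["severity"] == "high":
            record["action"] = "notify on-call safety manager"      # human action
        elif issue["type"] == "equipment":
            record["action"] = "open automated maintenance ticket"  # automated
        else:
            record["action"] = "add to weekly site review"
        return record

    print(route_safety_issue({"severity": "high", "type": "structural",
                              "location": "Site 4", "description": "loose scaffold"}))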
In onboarding situations, the mix of work execution platforms and automated assistants can help new employees to automatically receive information, be asked for feedback, and taken through company onboarding processes, with all relevant data sent to HR and other teams as appropriate.
The same systems are increasingly being integrated with voice-based systems such as Amazon Alexa or Google Home to allow voice activation to get an update from a work execution platform on which projects are on track and which are at risk.
Privacy and exploration
The number of potential use cases for AI is growing rapidly and as more data is fed into these systems, some people may have concerns over privacy. The best advice for enterprises is to be as transparent as possible with employees, partners and customers as to what data is being collected, how and why it is being used. In public facing systems, there should be a method of opting out of data collection along with a mechanism for complying with GDPR data requests if they should arise.
In terms of gaining the benefits, the simple rule of thumb is to start with a bite sized problem and look at how it could be solved without even considering if ML or AI is part of any solution. So, for example, take dealing with customer service queries via a web form - even creating a simple branching tree decision structure to categorise types of queries can help to streamline routing questions to the right teams. Then starting a small ML program to look at one of these streams to begin the learning process can make the process more manageable while impacting only a small subset of customers.
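Such a branching tree can be almost trivially simple; the sketch below uses invented team names and keywords purely to illustrate the starting point, before any ML is introduced:

    # Route a web-form query to the right team with a few nested checks.
    def route_query(subject, body):
        text = f"{subject} {body}".lower()
        if "invoice" in text or "refund" in text:
            return "billing team"
        if "error" in text or "crash" in text:
            return "technical support" if "urgent" in text else "support queue"
        return "general enquiries"

    print(route_query("Refund request", "I was charged twice this month"))
    print(route_query("App problem", "Urgent: the app crashes on start-up"))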
There are a growing number of off-the-shelf tools that can help build work execution platforms and start simple AI applications, and by avoiding sensitive areas such as financial or health records, trial and error carries little risk. The long-term aim for many will be to gain through automation, and those that are prepared to start now will be in a better place for the future.
As we move beyond artificial intelligence, we will begin to talk about augmented intelligence. With this, we will also see an evolution of the role of people in the workplace as augmented intelligence begins to assist individuals in how they do their jobs.
By Jon Payne, Manager - Sales Engineering, InterSystems.
Augmented intelligence is resulting in adaptive systems and intelligent tools which can be applied or used to facilitate the jobs carried out by people. While augmented intelligence, like artificial intelligence before it, may take some tasks away from people, we look at what this will mean for the role of human beings across different industries and the potential value of augmented intelligence.
Software development
Software development is one of the key industries seeing the impact of augmented intelligence. As software tools become more intelligent and get more capabilities built into them, we are seeing fewer people writing the low-level code that creates the fundamental building blocks of IT systems, and more people assembling pre-built components in ways that meet novel demands. The way in which these pre-built components operate provides more intelligent capabilities, and consequently makes it easier and faster for people to build systems that incorporate those capabilities. Rather than having a team of data scientists on tap to build augmented intelligence models, for example, it’s becoming easier to have components that you incorporate into a data workflow which will automatically construct models to perform certain functions, such as the de-identification of data.
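As a loose illustration of assembling a workflow from pre-built pieces (the de-identification step below is a stand-in written inline for clarity, not a reference to any specific product component):

    # Compose a data workflow from reusable steps rather than bespoke code.
    import hashlib

    def deidentify(record, fields=("name", "email")):
        cleaned = dict(record)
        for field in fields:
            if field in cleaned:                     # replace identifiers with short hashes
                cleaned[field] = hashlib.sha256(cleaned[field].encode()).hexdigest()[:12]
        return cleaned

    pipeline = [deidentify]                          # further pre-built steps slot in here
    record = {"name": "Jane Doe", "email": "jane@example.com", "spend": 120}
    for step in pipeline:
        record = step(record)
    print(record)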
Financial services
The advent of augmented intelligence is happening market by market, with the intelligent systems having a dramatic impact on the financial services sector. In commercial markets, traders are moving away from watching the market, plugging data into their models and executing trades. Instead, traders are overseeing the intelligent systems and components that are spotting trading opportunities and will make those trades. This is freeing traders up to help guide and define what makes a ‘good’ trade and help to develop the rule-base that the intelligent systems are using to evolve different strategies and adopt different approaches. Traders are therefore moving away from execution and, instead, taking on more strategic and less tactical roles.
Increasingly, the intelligent systems are not only handling the execution, but are also spotting opportunities and providing tools and capabilities that make it easier for traders to see them. This is taking strategies that were once the realm of highly specialised teams and making them widely available to the trading community. This is something we have already seen across more strategy-driven trading institutions, such as hedge funds, and it means that highly quantitative, analytical trading strategies are being more commonly adopted, as organisations don’t need a team of highly qualified people building models to identify opportunities. As a result, it’s much easier to have highly intelligent, adaptive systems supporting people who don’t have specialised analytical skills, enabling them to use those tools and techniques in day-to-day trading activity. In the financial services sector and beyond, augmented intelligence is bringing much more sophisticated capabilities and the ability to apply those sophisticated capabilities in more generalised environments.
Intelligent cars
Some would argue that we are now entering a phase of augmented intelligence acting as an ‘extension of a human being’. This is something we are seeing from the likes of Tesla and Google within the field of intelligent or self-driving cars. There is a vast amount of intelligence going into building the capabilities of intelligent cars to identify risks and enable the cars to automatically take action when they identify hazards, in order to support the driver. This therefore augments the ability of the driver to operate safely and efficiently in those environments and allows the car to operate in a much more sophisticated way.
Getting it right the first time
Augmented intelligence has the potential to help businesses and industries realise even greater value, particularly in businesses that use engineering and development processes where it is important to get things right the first time. In these situations, there are big costs involved with going beyond modelling and design, into the production stages and getting it wrong. This is especially true if the business is making tangible goods and there are tooling costs associated with those goods, which means the cost of getting it wrong is high. Therefore, as augmented intelligence helps people produce better quality products and performs a number of checks to ensure they are right, it drastically improves the likelihood of getting things right the first time and, as a result, reduces the cost of bringing new or changed products to market.
We are only just scratching the surface of what augmented intelligence can bring to society and organisations. The ability to shorten the decision-making process and to bring more appropriate information to people, presented in a more timely manner than has been possible to date, will revolutionise the way in which society operates. As augmented intelligence is adopted by an increasing number of markets, we will see its impact ever more clearly, affecting not only industry but all aspects of our society.
DW talks all things UPS with Power Control Ltd, underlining that this ‘humble’, if crucial, piece of data centre infrastructure is far from simple – both in terms of what it offers now and, perhaps most intriguingly, how it will develop in the future.
1. Please can you provide a little bit of background on PCL – history to date and the major milestones?
Power Control Ltd was formed back in 1994, following a massive surge in the use of technology and dependence on electronic equipment. Our owner and chairman recognised the requirement for power protection solutions and set up the business to meet this need. The 1990s saw Power Control win a number of high-profile projects, including several large data centres and 46 police fingerprint bureaus nationwide. Rapid growth also saw the company launch its dedicated power maintenance division.
To enable an increased stock holding, the company moved to larger premises, which also supported the expansion of the business’s service and production facilities.
In 2007, Power Control became the sole UK distributor for Italian UPS manufacturer Borri Spa, a partnership that delivered exponential growth for both companies.
Following years of research and development, Power Control launched its own single-phase range of UPS solutions. The first year saw almost a million units shipped.
Going from strength to strength, Power Control successfully completed a number of high-profile projects, all exceeding £1m.
Continued work within the data centre arena led to Power Control’s partnership with Huawei. Power Control is a value added partner and certified level five service partner for the UK.
Bolstering our manufacturer partnerships, Power Control also became the UK modular UPS partner for Legrand in 2017.
2. Moving straight on to the PCL UPS portfolio, what products and services do you offer?
Power Control offers a full range of UPS systems from 800VA – 6.4MVA. Our manufacturing partners for UPS include:
CertaUPS – 800VA – 30kVA
Borri – 10kVA – 6.4MVA
Huawei – 25kVA – 800kVA
Legrand – 10kW – 250kW
We also supply a wide range of power protection accessories and generators.
The company has its own nationwide engineering team to carry out full installation and commissioning work, as well as a dedicated project delivery team.
3. Specifically, can you tell us a bit about the various UPS manufacturers’ products you sell and support – starting with Certa?
Our predominantly single-phase UPS provider, CertaUPS, manufactures turnkey UPS solutions encompassing rackmount and tower systems. Recognised for their reliability, flexibility and affordability, CertaUPS systems deliver leading-edge efficiency in a compact unit.
Italian UPS manufacturer, Borri Spa, specialises in three phase UPS technology and in particular custom solutions for niche applications and environments.
The partnership with Huawei is incredibly strong. As the UK’s highest-level service provider, we have been selected by them to work on a number of multi-million pound projects. We are currently in the middle of a very exciting data centre build, but unfortunately I cannot disclose any more information at this time.
4. Historically, Borri has been important to you?
Borri Spa is still very important to our business. We work very closely with them in the UK and most recently have delivered 18 custom-built 100kW IP54 UPS systems for a water treatment works.
5. Most recently, you seem to have formed a closer relationship with Legrand – can you tell us about this link up?
The Legrand relationship is still in its infancy but Power Control is thrilled to have been chosen to be its modular UPS partner in the UK. We were selected for our positive influence across the market, service capabilities and technical knowledge demonstrated through the work we have done with Borri and Huawei.
6. Moving on to some industry buzz topics, why do you think that modular UPS systems are gaining in popularity?
The uptake in modular UPS has been on the rise for a number of years, and of late the curve appears to be even steeper. I’d attribute this to the demand for more flexible options. The ‘scale as you grow’ concept appeals to businesses of all sizes, especially in this economic climate. And of course there is the adoption of edge computing solutions and micro data centres.
7. Related (or not), the rise of the edge and micro data centres is having an impact on the UPS market?
The UPS market has shifted. Many have chosen to focus their attention on delivering solutions for the edge and micro data centres. That’s not to say Power Control hasn’t – it has. In fact, we have just previewed the CertaUPS MDC micro data centre, which has received a lot of interest. It comes with in-built air conditioning, PDUs, UPS and, of course, rack space.
However, whilst keeping our eye on edge requirements, we also remain committed to delivering solutions for other sector applications. The need for standalone UPS is still very much there.
8. And we can’t ignore the rise in interest in (and purchasing of?) lithium-ion battery-based systems – are they taking over from traditional lead-acid battery technology?
No, we can’t – and haven’t. We have recently released 1U and 2U lithium-ion UPS options from CertaUPS. Although there is a slightly higher initial purchase cost, the overall TCO is well worth it. Having a 1U option in our portfolio is also incredibly advantageous, particularly for those with limited rack space.
9. Turning to some end user pain points, what are the (extreme) environmental conditions of which end users need to be aware when specifying and running a UPS system?
UPS systems contain fragile electrical components that require stable and precise environmental conditions, which are often specified by each UPS manufacturer, to ensure maximum longevity. Key environmental factors include extreme temperatures, humidity, salt air and dust.
10. And how important is it that battery efficiency is maximised in a UPS?
Very important. The longevity of batteries is essentially what keeps the UPS doing its job. Ensuring batteries are regularly maintained is vitally important.
11. Similarly, what do end users need to consider when looking at UPS resiliency/redundancy?
A very simple view of this is how critical the load is that the UPS is supporting. Understanding this in terms of operational and financial impact usually determines the extent of redundancy needed.
12. And what about the importance of proactive UPS system maintenance?
Proactive maintenance programmes, like those offered by Power Control, ensure that users can always be one step ahead of any fault. Preventative maintenance looks at the wider scope of variables that could affect the performance of a UPS. These include adverse environmental conditions, such as a change in temperature, which a preventative maintenance visit would identify before they cause harm to the system – even if that problem won’t manifest for a few years.
Many businesses fall into the trap of a false economy: anything that can affect the reliability and performance of the UPS system and is not addressed negates the purpose of having a UPS in the first place.
13. Finally, what are the factors to consider when weighing up UPS system purchase price versus Total Cost of Ownership (TCO)?
The main deciding factor is budget, and whether there is a capital budget separate from the operating budget. For example, if a project has a set allocated budget and that team is not responsible for the ongoing operations of the installation, TCO is a much lower priority than the initial purchase price. An operations team, or a company with long-term budgeting, will nearly always care much more about the TCO than the initial price. The biggest factor is who is making the decision – end user, specifier, installer or integrator. It must always be remembered that the vendor needs to make money somewhere, so TCO is the best method of comparing different suppliers on a like-for-like basis. Ultimately it depends on what the end customer wants and what their largest concern is.
14. Finally, finally, what can we expect from UPS systems into the future – for example, how will intelligent automation make a difference?
I’d expect UPS systems in the future to be able to take multiple inputs from renewable sources and to manage which one is used for supporting the load, and how any excess energy is stored. IA will play a part in managing these various inputs, and, with the load itself becoming more intelligent, it will be able to communicate with the UPS controls to anticipate power demands, so the UPS can decide on the most efficient and cost-effective power source to use.
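Purely as a speculative illustration of that source-selection logic (all names and figures below are invented):

    # Pick the cheapest source that can cover the anticipated load,
    # falling back to stored energy when nothing else is viable.
    def choose_source(sources, anticipated_load_kw):
        viable = [s for s in sources if s["available_kw"] >= anticipated_load_kw]
        if not viable:
            return "stored energy"
        return min(viable, key=lambda s: s["cost_per_kwh"])["name"]

    sources = [{"name": "solar", "available_kw": 40,  "cost_per_kwh": 0.00},
               {"name": "wind",  "available_kw": 15,  "cost_per_kwh": 0.00},
               {"name": "grid",  "available_kw": 500, "cost_per_kwh": 0.18}]

    print(choose_source(sources, anticipated_load_kw=30))   # solar
    print(choose_source(sources, anticipated_load_kw=80))   # grid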
We asked a range of industry professionals how collaboration in the IT space needs to develop. This is in terms of both how end user organisations need to get internal departments to work together to ensure faster and smarter new product and services development; and also how it is increasingly important for vendors to work together to provide hyperconvergence/integrated solutions which are ready to work 'out of the box'. Not to mention the importance of the supply chain working closely with the customer. Part 2.
David Keens, Principal Marketing Technologist at Acxiom, comments:
The future development of collaboration in marketing and advertising technology must consider that, underlying everything, much of the data represent real people: their behaviours and characteristics, their family relationships and friendships—their lives. Integrated solutions must manage data in a way that’s transparent to the end consumer and respects the permissions granted. IT collaboration isn’t simply a technical problem to solve.
Collaboration in the IT space has an enormous impact on the brand experience for end consumers. As consumers today, we all expect consistent communications, personalised experiences, and brands that respect the permissions we grant when we share our data. However, to deliver these communications and experiences, brands have thousands of different advertising and marketing technology vendors to choose from. Complicating this further, brands likely already have ‘silos’ of data across the organisation. It is vital that collaboration between these vendors creates integrations that ‘just work’. Consider, though, that this isn’t just about the technology. To create these consistent consumer experiences, internal departments, from product development to customer services, often need to collaborate in new ways.
To make this collaboration more effective, a key development is defining an open approach to connecting the data signals across these siloed systems, technologies, vendors, and internal departments. This open data layer is built from policies, organisational structures and technology, and can function as a unifying principle. It helps create an environment in which the end consumer, represented by these data signals, is at the heart of the organisation.
Creating a unified and open data layer is challenging. Hyper-convergence and increasingly integrated solutions from vendors can be beneficial for some brands. Many of the larger vendors have invested heavily in integration between their offerings, and this can help remove barriers to new collaboration. Teams in digital marketing, CRM, customer service and analytics can start to use increasingly consistent sets of data, taxonomies and user interfaces. However, to complete the management of the consumer experience, there are typically still additional technologies that need to be added into the mix, and this requires specialist systems and data integration skills from internal IT departments and their partners.
There can also be downsides to this convergence, especially if it is excessively led by the vendors. Consumer data is valuable, and some platforms slide towards becoming ‘walled gardens’, where a brand’s data about its consumers is out of its reach and difficult to manage. Also, some vendor-integrated technologies will not meet the requirements of a brand, with the promise of technology working ‘out of the box’ proving false. Regulations about managing consumer data are complex and currently evolving at a fast pace. It’s easy to use technology integrations in ways that infringe data privacy legislation and consumer consent. This highlights the importance for brands of collaborating with trusted advisors and consultants who can help navigate this complexity.
Andrew Filev, founder and CEO of collaborative work management platform Wrike, says:
“Almost every enterprise in the world is already embracing a collaborative work management (CWM) tool in some capacity, especially those that have aggressive growth plans for the next 2-5 years. I expect that we will see more companies, large and small, begin to understand the very real benefits of CWMs and expand deployments of CWM technology to more teams, as competitive pressure heats up.
“Cross-departmental adoption of work management tools has traditionally been driven by Project and Program Management offices. IT and Operations will become the major cross-functional champions this year. In fact, Gartner estimates that 70 percent of organizations leveraging CWM systems will report that their teams are significantly better performing by 2022.
“However, I expect an exponential growth in the adoption of a company-wide unified collaborative work management platform in the next three years. This will bring benefits that go beyond just productivity gain in a single department. It will deliver better visibility and coordination from the C-suite to the field.
“The pressure to do more work with fewer resources, and at an ever-increasing pace, will not only require an advanced collaborative work management platform with a robust automation feature-set. It will force the adoption of lightweight project management methodologies beyond the Project Management Office and the creation of company specific Standard Operating Procedures (SOPs) designed for the digital workplace. Furthermore, for those companies undergoing digital transformation, the success of these initiatives will depend on specific repeatable workflows and SOPs because process consistency is the only way teams and organizations will be able to master work at scale.”
Ian Fairclough, VP of Services, EMEA, MuleSoft says:
“Organisations operating in today’s fast-paced digital economy win or lose based on how quickly they can deliver innovation and change. IT naturally plays a central role in this, but all too often gets swamped with requests, falls behind and ends up bottlenecking project delivery. MuleSoft’s 2019 Connectivity Benchmark report found that nearly two thirds of businesses were unable to deliver all of last year’s IT projects. To keep pace with growing demands for faster and smarter products and services, organisations need to rethink the way they deliver IT.
“IT departments need to evolve from being an ‘all-doing’ organisation to an ‘enabling’ organisation, so they can respond to the exponential increases in technology demand and support much broader adoption and consumption of technology that is beyond the capacity of the IT department itself. Rather than serving as an order-taker, IT needs to find a more collaborative approach to delivery, pushing innovation out to the edge, to be driven by end-users. This requires technology assets to be exposed in an easily consumable way, allowing anyone within the business – or third parties – to pick up and build upon existing capabilities.
“This depends on changing the dominant mindset that new products and services need to be built from the ground up. Organisations should first consider whether any part of their project has been built before, and if so, whether they can reuse any of those components. Additionally, organisations need to ensure they’re building for the crowd and making their assets as simple to discover and use as possible. One way this can be achieved is through APIs. These digital building blocks can further be composed into an application network, where IT assets and capabilities can be plugged in and out easily to create new services.
“This also places huge emphasis on collaboration – whether among team members, between IT and the wider organisation, or between organisations working together. As such, it’s important for users to have the opportunity to give and review feedback on the assets and capabilities being exposed to them through APIs; perhaps via an app-style user rating system. There will also be an educational element, especially in the early stages. IT and tech-savvy staff are likely to be faster to understand and earliest to adopt the approach, but it may take longer for the wider organisation to join the dots between self-service APIs and business success.
“IT should also empower more of the business to deliver their own projects in their own way. Non-IT employees such as mobile app developers, for example, could independently update a mobile app with legacy data by tapping into an existing API. The emphasis should be on reuse and self-service, freeing IT to work on higher-level strategic projects while saving time and resources.
“Ultimately, organisations should forget about building systems from scratch and working in silos, which serve to slow innovation and change. Instead, they should embrace a new approach based on collaboration, allowing them to sprint and truly unlock their potential.”
Martin Morgan, VP Marketing, Openet, comments:
The IT space has long been a battlefield shared by vendors and their customers. Vendors have long sought dominance over this battlefield and as a result, many customers have found themselves using IT solutions and systems that are inflexible, hard to move, and costly to run. Thankfully, this is changing, and the industry is seeing more vendors and customers embrace the principles of collaboration.
IT companies simply can’t afford to stifle their rate of innovation due to vendor lock-in and costly systems that take months and years to upgrade, and that still fall short of a company’s needs. With the rapid rise of internet and web-scale companies, legacy IT companies are recognising the tangible benefits of adopting an open approach to technology development. They understand that new approaches such as DevOps and open source are the future of IT development and will be responsible for creating the solutions of tomorrow.
But while this realisation is a positive step forward for the IT industry, it cannot happen in one fell swoop. Instead, it’s about changing mindsets, both in vendors and in customer organisations. IT companies need to understand the urgency of the landscape they find themselves in – new services need to be deployed quickly and seamlessly, as end users constantly ask for more, faster, better services and solutions. This cultural shift should encourage a move away from bespoke, large-scale IT solutions that offer little flexibility, towards collaborative approaches that create ‘out of the box’ solutions that can be easily implemented and integrated, through the use of open APIs, into existing technology stacks without requiring a complete overhaul. To be successful, this approach will require strong partnerships, both between vendors and with their customer organisations.
We’re already seeing the value of partnerships from a number of vendor companies who are partnering to create ‘out of box’ solutions that allow companies to quickly launch and monetise new services without needing to build new bespoke solutions or infrastructure. This level of partnership and collaboration is allowing different vendors to contribute their expertise and knowledge to build solutions that suit the needs of their current and prospective customers.
The current IT landscape is all about speed and efficiency. But IT companies simply won’t achieve the efficiency they need to survive by going it alone. They must form the right partnerships and encourage knowledge-sharing to push innovation forward and deliver the IT solutions the industry truly needs.
Stuart Gilks, Systems Engineering Manager at Cohesity, says:
"When I look back on the lifetime of my career in technology, the amount of partnership and collaboration between vendors has increased significantly. Typically well-established vendors need to work with start-ups because the latter’s agility and time to develop and innovate is typically much shorter than a bigger firm. Likewise, earlier stage firms benefit from the supply chain, reseller network, customer reach, and brand awareness that more established organisations have.
Customers are also more open to working with earlier-stage companies, understanding that big innovations and competitive advantage often come from being earlier adopters; doing that in partnership with an established vendor reduces their risks.
Additionally, with the rise of industry standards, open APIs, interoperability testing and open source technologies, people in the industry are less fearful of opening up to working with other vendors that complement their offer. It’s been a sea change from how things were; talk of walled gardens and vendor lock-in is significantly reduced, and interoperability, portability and integrations are more common parlance amongst vendors.
The collaborations and integrations we are now seeing between vendors, and the consolidation and convergence they are producing, are driven by customers’ wallets. Customers don’t want to spend money on technologies that require specific and separate infrastructure to run individual applications and functions, bloating the capital expenditure budget and requiring silos of specialist skilled teams to operate.
The focus is on Total Cost of Ownership (TCO): being smarter with what you have, paying as you use, and ideally not needing any specific additional technologies and skills to function. The ball is firmly in the vendor’s court to ensure they come to market with products that play nicely with others and deliver a return in value that surpasses the price paid, and then some!
As an example, you're now seeing software vendors make their technology hardware, hypervisor and OS agnostic. Over the next few years I see software vendors bringing their apps to data platforms where the majority of customer data already sits, allowing new services, innovations and business value to be delivered from the same infrastructure already delivering 'business as usual' services such as Data Protection, NAS and object storage, Test/Dev clones and data archiving.
Sitting on top of existing security policies, principles and governance frameworks that are automatically understood and already in place ensures agility, operational efficiency and innovation without data or security compromise, as well as the small matter of avoiding additional infrastructure budgets, procurement, implementation and ongoing operational costs! In short, hyperconvergence and industry collaboration are fuelling improvements in both TCO and business value – and with the largely untapped opportunity to do the same with the majority of customer data assets, we have a huge opportunity to continue and accelerate this delivery well into the 2020s.”
80% of sales and marketing leaders are either already using chatbots as part of their customer experience or plan to do so by 2020. While many see automated assistance as a valuable customer service tool, chatbots also give organisations the opportunity to support employees and their use of technology platforms internally. IT leaders should consider this when they look at the potential of chatbots.
By Pete Kinder, chief technical officer at Wax Digital.
Take eProcurement; any organisation that has taken a digital approach to its buying processes could benefit from chat-based support for employees when they use the purchasing software. Like many technology platforms, eProcurement is used across a number of departments and by employees without detailed procurement knowledge. However, chatbots aren’t everyone’s favourite thing. A survey of 5,000 people found that 56% still prefer to speak with a human rather than receiving automated assistance. For this reason, IT professionals should put careful thought into their choice of chatbot technology to boost employees’ effective use of the system.
Given that many IT systems, including eProcurement tools, use Machine Learning and Artificial Intelligence (AI) to improve the user experience, chatbots will encourage employees to ask questions and get insightful responses. Some chatbots might go a step further, detecting whenever a user needs help and automatically giving advice. Advanced eProcurement systems track how people operate the system and use that learning to advise future users on how to perform certain actions better.
For example, the platform might acquire intelligence on how particular invoices are being coded, which then informs users how to enter the statement into the system in future scenarios and avoid simple mistakes. Chatbots allow users not only to receive support in navigating the technology more effectively, but also to ask specific questions about the system using a touchless user interface. Employees could enquire about historical spending data over a certain period and ask questions such as ‘How much did we spend on IT software in Q3?’ simply by speaking to the eProcurement system.
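In skeletal form, and with invented spend records and category names, answering that kind of question might look like this:

    # Parse the quarter from the question, then aggregate matching spend records.
    import re

    spend = [{"category": "IT software", "quarter": "Q3", "amount": 12000},
             {"category": "IT software", "quarter": "Q3", "amount": 4500},
             {"category": "Travel",      "quarter": "Q3", "amount": 8000}]

    def answer(question):
        found = re.search(r"\bQ[1-4]\b", question, re.IGNORECASE)
        quarter = found.group().upper() if found else None
        total = sum(r["amount"] for r in spend
                    if r["category"].lower() in question.lower()
                    and (quarter is None or r["quarter"] == quarter))
        return f"Total spend: £{total:,}"

    print(answer("How much did we spend on IT software in Q3?"))   # Total spend: £16,500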
It’s not only buyers using eProcurement tools that can benefit from chatbots – suppliers can gain from it too. Businesses often reject invoices due to errors, which delays payment. Using chatbots, suppliers can query why invoices haven’t been paid and the eProcurement system can flag an incorrect invoice; enabling them to resolve the issue more quickly.
As stated, a basic chatbot isn’t enough to support a complex software platform. This is because of how unnatural and limited the pre-programmed responses are. Advanced chatbots can have dozens of responses to one question coded into the software, and AI can determine the right one to use based on the type of user and the emotional state that they’re in. For example, if somebody uses a lot of tech speak, chatbots can respond in the same vernacular and use AI searches to automatically replace words with synonyms.
Similarly, if a chatbot user is getting angry with technology not working properly, AI can detect that in the choice of language and ensure that the chatbot’s response has a tone of voice that doesn’t patronise or aggravate the person further.
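A simplified sketch of choosing between pre-written response variants by apparent tone might look like the following (the word lists and responses are illustrative assumptions, not a production sentiment model):

    # Pick a response variant based on simple signals in the user's message.
    FRUSTRATION = {"useless", "broken", "ridiculous", "again", "still"}
    TECH_SPEAK = {"api", "endpoint", "timeout", "config", "stack"}

    RESPONSES = {
        "technical":  "Looks like a configuration issue - check the endpoint settings.",
        "frustrated": "Sorry this keeps getting in your way. Let's sort it out together.",
        "neutral":    "Sure - here's how to do that.",
    }

    def pick_response(message):
        words = set(message.lower().split())
        if words & TECH_SPEAK:
            return RESPONSES["technical"]
        if words & FRUSTRATION:
            return RESPONSES["frustrated"]
        return RESPONSES["neutral"]

    print(pick_response("The invoice screen is broken AGAIN"))
    print(pick_response("Which endpoint does the config use?"))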
eProcurement is used across many departments and has several user types with different privileges. Some might be able to buy items, while others can only retrieve certain information held in the system. The chatbot needs to be programmed to know what different users can view and restrict access accordingly. Without those measures in place, any user from any department can retrieve any information held by the eProcurement system, increasing security risks. For example, if the level of technology expenditure in the past quarter only concerns the IT, finance and procurement departments, parameters can be set to ensure that only they can access the information.
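A minimal sketch of that permission check, with invented roles and rules:

    # The chatbot only answers if the requester's department may see that data.
    ALLOWED = {"technology spend": {"IT", "Finance", "Procurement"}}

    def answer_if_permitted(department, topic):
        if department not in ALLOWED.get(topic, set()):
            return "Sorry, you don't have permission to view that information."
        return "Technology spend last quarter was ..."   # the system would be queried here

    print(answer_if_permitted("Finance", "technology spend"))
    print(answer_if_permitted("Sales", "technology spend"))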
A procurement system that includes a chatbot is a great time saver and helps to improve the use of technology. By ensuring they introduce an intelligent chatbot that gives personal and tailored responses, organisations can be sure that employees and suppliers will regard it as a useful tool and use it to support their eProcurement activity. The introduction of voice-enabled virtual assistants like Amazon Alexa demonstrates that we’re starting to see a culture of employees willing to ask a device questions. Organisations should channel this into chatbot use to make procurement activity more effective and run more smoothly.
It didn’t take long for video games to become a huge cultural force. Now worth roughly double the film industry, gaming is massive – and by its very nature, it’s at the forefront of technology.
By Jessica Smith, Developer, Fasthosts.
Alongside advances in graphics and VR, recent years have seen major developments in cloud gaming – most notably with Google’s recent announcement of Google Stadia, its own cloud gaming service. Also known as “gaming on demand” or “games as a service”, cloud gaming boils down to the simple idea of applying cloud computing to gaming, and it has the potential to revolutionise the way video games are consumed. But the question remains, will cloud gaming rewrite the rules by replacing dedicated gaming hardware? Or, as is more likely, will it carve out its own niche for casual gamers? Or will it flop entirely?
From ZX Spectrums and Commodore 64s to the latest consoles and PC graphics cards, hardware has always been a huge part of gaming. Crucially, this hardware can usually be found sitting on a desk or under a TV. It’s in this sense that cloud gaming is a fundamental shift: instead of running games on a device in the user’s home, cloud-based services do all the processing at another physical location, potentially hundreds of miles away. Of course, cloud gamers still need something to connect to the server. But because there’s no need for on-board gaming capabilities, virtually any internet-enabled device will do, whether it’s a tablet, laptop or smart TV. Specialised, low-powered devices or “thin clients” are also available – gaming-optimised streaming boxes that can, in theory, turn any screen into a high-performance games machine.
The advantages of this cloud model for gaming? The newest hardware doesn’t come cheap: hundreds of pounds for a new console every few years, or thousands for a tricked-out PC, plus the games themselves – and with gaming technology constantly evolving, upgrades are needed regularly to enjoy the latest titles at their best. By connecting to devices in the cloud, costs are reduced to a subscription fee, and the provider takes on all the responsibilities of maintaining high-end gaming systems. This enables the cloud provider to leverage economies of scale to deliver higher gaming performance at lower cost than is currently achievable with dedicated hardware at home. Just look at the specs announced for Google Stadia:
These specs are a major jump over the current crop of enhanced consoles (Xbox One X and PS4 Pro). In fact, they are much closer to the expectations for the next generation of consoles which aren’t due to be released until late 2020 at the earliest. While the subscription cost has not been announced yet, with expectations in the region of XXX, Stadia has value on its side. You could not get the same level of hardware for your desk or console for that price. It is also worth remembering that Google has stated these specs are for the 1st Generation of Stadia, so you can expect these to improve over time, further adding to the value.
At first glance then, cloud gaming is just another aspect of cloud computing in general. It is sometimes referred to as the “Netflix for games”, and there are certainly parallels, with millions of viewers now happy to access content on remote servers rather than owning it on DVD or Blu-ray. However, due to its interactive nature, gaming is far more demanding than video streaming.
Latency: cloud gaming’s Achilles heel
The critical piece of the cloud gaming puzzle is latency. A data centre can have cutting-edge hardware running the latest titles at ultra-high settings and framerates – but the user’s experience will always depend on the speed of their connection. Games depend on real-time interactions, so any lag between a button-press and on-screen action is detrimental. Tolerable latency varies across genres, but for fast-paced shooters and action games, it needs to be kept to an absolute minimum to ensure a playable experience.
And this is where cloud gaming has fallen down historically. In the past, bandwidth just wasn’t high enough to deliver responsive gameplay – but now, internet speeds are starting to make cloud gaming a viable option. 5G broadband, with its promise of ultra-low latency, is also seen as a key enabler for cloud gaming, but even that is still a few years off. It’s important to note that high-speed internet is still far from widespread, even in developed markets, where poor connectivity and bandwidth caps remain issues for many users.
It is worth remembering that cloud gaming isn’t exactly new either. Latency is not a new problem. Cloud gaming can be traced at least as far back as 2000 and G-cluster – a service that offered on-demand gaming via set-top boxes. A variety of cloud gaming projects have come and gone since then, notably OnLive and Gaikai, the latter of which was acquired by Sony in 2012 to form the basis of PlayStation Now – Sony’s own cloud-gaming subscription service that lets players stream older titles on their console or PC. In all instances, latency, due to poor internet connectivity, has held these services back.
Fidelity
Unfortunately, network latency is not the only concern that gamers have. A reduction in graphical fidelity due to video compression is another critical factor. Even at 1080p/60fps, Google Stadia demands a consistent 25 Mbps connection, and even for those lucky enough to have one, compression artifacts are unavoidable because the video stream has to be compressed. With games now pushing 4K resolutions and beyond, those artifacts are only going to become more noticeable. In a world where graphical fidelity is so important, it may be hard to convince gamers to stream their games if there is any chance of a graphical downgrade in doing so.
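Some rough, back-of-the-envelope arithmetic shows why:

    # Uncompressed 1080p at 60fps and 24 bits per pixel is roughly 3 Gbps,
    # so fitting it into a 25 Mbps stream implies a ratio in the region of 100:1.
    raw_bps = 1920 * 1080 * 24 * 60          # ~2.99 Gbps of raw video
    stream_bps = 25_000_000                  # the stated 1080p/60 requirement
    print(round(raw_bps / 1e9, 2), "Gbps raw")
    print(round(raw_bps / stream_bps), ": 1 compression ratio")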
Gaming culture
Convincing gamers to back any change to their modus operandi brings us to the final challenge. While strong development of the technology is one thing, cultural approval is quite another. From controversies surrounding “online only” single-player games like SimCity and Diablo 3, to the “always online” debate overshadowing the Xbox One announcement in 2013 – gamers have a tradition of scepticism when it comes to cloud-based services.
Aside from scepticism of cloud services, it is important to recognise that gamers have a longstanding attachment to their hardware. Google Stadia may offer top-end specs, but the fact that you will never see this impressive hardware is a problem for many. Building a custom rig is a big part of PC gaming culture and an important part of the hobby, so PC gamers are likely to resist cloud gaming the most.
This attachment to hardware is not just an emotional one either; it comes back to the latency argument. PC gamers in particular do all they can to eliminate latency in their setups – wired mice and keyboards, low-latency/high-refresh monitors, wired headsets and so on – always looking to shave off milliseconds wherever they can. The attention paid to latency reduction is obsessive. Asking a gamer to then add another 200ms of latency for the convenience of not owning hardware may be a leap too far.
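A simple, illustrative latency budget makes the point. None of the figures below are measurements; they are plausible ballpark values chosen only to show how quickly cloud round trips swamp the milliseconds gamers work so hard to save:

```
# Illustrative click-to-screen latency budget (ballpark figures, not measurements).

local_rig_ms = {
    "wired mouse/keyboard input": 1,
    "game processing (one 60fps frame)": 17,
    "high-refresh monitor response": 4,
}

cloud_extra_ms = {
    "encode video in the data centre": 10,
    "network round trip": 30,
    "decode video on the client": 10,
}

local_total = sum(local_rig_ms.values())
cloud_total = local_total + sum(cloud_extra_ms.values())

print(f"Local rig, click to screen:   ~{local_total} ms")
print(f"Same game played via cloud:   ~{cloud_total} ms")
# Even this optimistic extra ~50 ms more than triples the local figure; a poor
# connection pushing the round trip towards 200 ms makes the gap far worse.
```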
Is cloud gaming here to stay?
Looking forward then, what are the long-term implications of cloud gaming? To say that it could be disruptive is a huge understatement. The industry is currently built on the assumption of home hardware, with the three main console makers acting as gatekeepers between games publishers and their audience. A cloud-dominated market would make publishers far less dependent on an installed base of machines, and could even lead to each major publisher running its own cloud gaming service. Microsoft and Sony will continue to invest heavily in their own cloud offerings and in-house development studios, no doubt – but will video game consoles even exist in a decade’s time?
You only have to look at what the big gaming companies are doing to see that cloud gaming is where the future lies. We’ve already mentioned Sony’s PlayStation Now service. Microsoft also has its own game streaming service in the works with Project xCloud. And even Nintendo, a far more traditional gaming brand, has established some major cloud inroads on its platforms. Streamed versions of demanding games like Assassin’s Creed Odyssey and Resident Evil 7 – originally designed for Xbox, PlayStation and PC – are becoming available on the comparatively weak Nintendo Switch hardware. If the actions of the major players are anything to go by, gaming on demand could gain serious traction over the next decade.
Hardcore vs. casual gamers: the cloud gaming battlefront
For the moment, traditional consoles are more popular than ever, but it's not a stretch to imagine them dying out. Cloud gaming is rapidly maturing to the point where it could easily become the mainstream option, even if alternatives continue to exist for enthusiasts. Just as film buffs still seek out physical copies and music fans still spend thousands on record collections, specialised hardware will always have a place in the homes of hardcore gamers. But Netflix subscriptions beat Blu-ray sales, and Spotify users outnumber vinyl collectors. Games, as the newer, more technologically demanding form of media, are just taking a little longer to make the jump. All the signs point to cloud gaming becoming the new normal – and there's everything to play for. Provided the additional latency and the drop in graphical fidelity are negligible to most users, the convenience of the "Netflix for games" will most likely be too hard to resist for the majority of gamers.
In the first of an occasional series profiling what might be described as the 'youthful element' of the IT and data centre industry, DW talks to Maryam Ya-Alimadad CEng MIMechE, Mechanical Design Engineer, Sudlows.
1. What were the factors that influenced your choice of subjects studied at school and university – hence your decision to become a mechanical engineer?
I have always been good at maths and enjoyed solving mathematical problems. When at college, I decided to choose a subject where I could apply my maths skills. Hence, I applied for Mechanical Engineering.
2. And now that you’ve become a mechanical engineer, you are passionate about helping others discover this career, via your work as a STEM ambassador?
Yes, I truly am. I believe we need to introduce the younger generation to engineering. We need to make them aware of the work that Engineers and Scientists do, and the broad variety of projects they get involved with.
3. How did you decide to specialise in cooling technology/working in the data centre industry?
After finishing my degree, I started working as a Mechanical Designer for a heating and cooling company, which was mostly focused on residential and commercial buildings. I started working for Sudlows, who specialise in data centre cooling, last year.
4. And how did you come to work for Sudlows?
I applied for a job with Sudlows, who specialise in critical infrastructure, specifically data centres. This was of interest to me for a few reasons: we live in the age of information, and data centres play an important role in this; I had experience in the field of heating and cooling; and this seemed like an exciting next step in my career.
5. What is it that you like about your job role?
Every project is different and requires a bespoke design to suit. This is what makes things interesting. Every day is different.
6. And are there any downsides?(!)
I wouldn't call it a downside, but each project has its own challenges – then again, those challenges are what make each project interesting.
7. You specialise in cooling technologies for data centres – is every project completely different, depending on location, data centre design, or are there some obvious cooling technologies/ideas that are universal?
There are similarities, but there are also differences from one project to another. Depending on location, client requirements and the nature of the data centre, the solutions can vary.
8. In other words, is there an optimum cooling technology/design out there, or just the one that fits a certain set of circumstances the best?
There are always techniques that can help deliver marginal improvements to designs. One of the key tools we use is Computational Fluid Dynamics (CFD). This sophisticated modelling software can simulate a range of operating scenarios to analyse each individual element of a critical infrastructure, without any risk to the facility. As each project has different requirements and constraints, this modelling technique enables engineers to implement the optimal cooling solution, independent of the design that is chosen.
9. Without breaking any confidentiality agreements, are you able to share with us the kind of work you’ve been doing on some recent projects – how you’ve been helping clients solve problems?
I work on a variety of projects – anything from smaller data halls with communication rooms and switch rooms, which can be served using DX systems, to larger data halls with greater cooling demand, which use chilled water systems to provide cooling.
10. And what’s more enjoyable, addressing the cooling challenges of an existing data centre, or designing the cooling for a brand new facility?!
Both have their own perks. With a retrofit job, there are limitations and challenges which can make the work more interesting; with new builds, there is more flexibility in the solution that can be chosen.
11. There’s a renewed interest in liquid cooling – how do you see this playing out over time in terms of its impact on data centre design?
As we become more and more aware of the effects of higher-GWP refrigerants, the regulations are understandably becoming more restrictive about which refrigerants can be used. However, there is still reluctance to bring water or other liquids into the data centre environment.
12. And do you think the heat recovery aspect of liquid cooling will be leveraged more and more into the future, or only in a handful of cases?
There are still limitations to the heat recovery aspects of cooling, especially at larger scales. Nevertheless, we are moving towards more efficient and environmentally friendly systems.
13. More generally, how do you see the digitalisation of the data centre having an impact on the cooling requirement – ie AI, 5G, IoT etc. require faster and faster, ‘hotter’ IT hardware?
We are moving closer to a world which will increasingly rely on data centres and IT hardware. I believe there will be more data centres; however, IT equipment is also becoming more and more efficient, which means the cooling demand may not be as large as one would expect. I believe the general move is towards efficiency more than anything else. There is also an emphasis on fast data transfer, hence the rise of edge data centres.
14. And how do you see this trend having an impact on overall data centre design – for example, there’s talk of higher and higher densities housed in taller and taller data centres?
Although I have previously mentioned that IT equipment is becoming more efficient, so densities are not necessarily increasing, much also depends on technologies on the horizon that we are not yet aware of. A good example is a 5G roll-out we are carrying out for a telecommunications organisation: not only does this new technology incorporate higher-density equipment, but some of the innovative technologies that 5G will enable could have a major impact on densities. Basically, with technology moving at such a rate, it is difficult to predict exactly what lies ahead.
15. Any other thoughts about the data centre industry and/or your role within it?
To reiterate an earlier point, the mission critical environment is extremely interesting because technologies are being continually developed and deployed. Yet what is a source of frustration for me is that statistics show the average age of a data centre specialist is now 54 and, with the lack of young engineers entering this sector, we are going to have an issue with skills shortages. Young engineers with great potential are not even aware of this global sector, and therefore we face a potential generation gap ahead.
We asked a range of industry professionals how collaboration in the IT space needs to develop: both in terms of how end user organisations need to get internal departments working together to ensure faster and smarter development of new products and services, and in terms of how it is increasingly important for vendors to work together to provide hyperconverged/integrated solutions that are ready to work 'out of the box' – not to mention the importance of the supply chain working closely with the customer. Part 3.
Mark Sollars, CTO at Teneo, believes that enterprise innovation can be supported by collaboration from unexpected directions:
He explains: “Some enterprises’ need for security and better application availability as they drive innovation really makes them start to think about things like their local Internet break-out. The fundamental question they are asking is: how do I give customers the best access to Internet and then secure it? Take an enterprise that is security-driven in its network infrastructure planning: its in-house team knows that they could do the existing traffic routing but it would be a complex task. By working with a specialist and using technologies such as SD-WAN, it could potentially be easier for the company to run traffic over its networks with a web-based user interface as part of an SD-WAN installation, rather than an on-premise set-up.”
Integrators help in-house IT teams innovate
Mark Sollars of Teneo, the specialist integrator of next-generation technologies, says integrators can also help enterprises’ different in-house teams collaborate to better define their digitization and network performance and security analytics needs.
He comments: “Enterprise IT teams are starting to use discovery sessions with external networking specialists to simplify their network and security architectures. These sessions help in-house teams find ways to route the right traffic to the right network and security analytics tools to provide the desired visibility of network performance and security threats. In our experience, these sessions are most effective when different network and security teams are represented.”
John Fry, Information Security Manager, Rufus Leonard, comments:
“When an IT overhaul doesn’t land or deliver the returns it’s supposed to, it’s tempting to question the technology; but digitalisation is never just about ticking that box saying you’ve modernised your IT infrastructure, upgraded your software or enhanced your website. It’s about creating a more efficient, consistent way to do business. That’s why it’s different for every organisation, and a fundamental part of a bigger strategic roadmap for your entire business. From saving time on routine tasks to making customers happier, the outcome is going to bring rewards for everyone, not just the tech gurus.
“There's no getting away from the fact that any such major organisational shift will be disruptive. Successful IT transformation requires clear leadership and effective communication of a clear, consistent vision based on reasons employees can get behind, supported by the creation of blended teams that combine insight and skills and create a mutual understanding of the requirements.
“To bring people on the IT journey in this way, you need senior-level oversight. The C-suite is where the narrative is set for the rest of the company. If that narrative is one of scepticism and uncertainty – or even a complete lack of engagement – this will inevitably trickle down.
“To harness IT in a way that advances your brand, your edge, and your bottom line, you need to truly involve all aspects of the organisation from the very outset. People – regardless of department – don't like having new technology foisted on them. If it's been developed in isolation, it might not even be fit for purpose. Collaboration at the start gathers the right scope and inputs into the design process, translating them into a more seamless implementation where everyone knows what's happening and gets behind the change. It also ensures that those responsible for business, information security or project risks are aligned.
“What's really happening in any major IT project is the beginning of a journey where a new process becomes part of your growth as a business – it's a shift in mindset, ways of working and organisational structure. The ultimate goal is to have company-wide representation – ensuring everything from requirements to cost and risk has been considered, with the resulting change properly evaluated by all who may be affected. The organisations that flourish in the future will be those that share objectives, communicate clearly, and celebrate key milestones together.”
Infor has a customer collaboration partnership designed to redefine retail management software - the customer is Whole Foods:
"The new retail platform we will co-create with Infor will be unlike anything currently on the market, better leveraging major technology advances to deliver much more value at lower cost," said Jason Buechel, executive vice president and chief information officer of Whole Foods Market. "With Infor, Whole Foods Market has found a partner ideally suited to help us co-create a new retail platform that enables a more efficient, connected enterprise with greater visibility, flexibility, insight, and ease-of-use for our team members - all while delivering a better end-to-end shopping experience for our customers."
Through the partnership, Whole Foods Market is a working lab for Infor development engineers and designers from Hook & Loop, Infor's Manhattan-based internal design agency. Infor teams work alongside Whole Foods Market team members to identify critical process improvements and develop a purpose-built software suite that enables Whole Foods Market to make better and faster decisions, take advantage of modern technologies like cloud and open source, and deliver a better experience back to its customers.
Tom Adams, Director of Product Marketing at Cogeco Peer 1, says:
“It is fair to say technology has completely changed business over the past 20 years and is now integral to business operations, playing a vital part in every aspect of the organisation, irrespective of industry. Processes that would previously have taken teams prolonged periods of time to complete now take just a couple of clicks and are finished in seconds. This should be increasing the productivity and efficiency of British businesses, but this hasn't been the case, and Britain is half as productive now as it was in 1949, according to the Office for National Statistics GDP measurements.
At Cogeco Peer 1, we recently carried out research among IT decision makers in the UK to understand how well technology is serving organisations and what the IT industry can do to help power the potential of UK businesses. One key theme that comes across loud and clear is the need for education from vendors. The research found that 55% of IT decision makers admit they struggle to keep up with the pace of technology, and 28% of IT leaders say they feel they can't keep up with the increasing number of changes available to them. This is worrying: vendors must educate their customers on products and services, and only then will businesses reap the benefits of the technology and reach their potential.
The research also highlighted the need for collaboration between IT vendors and their customers. As things stand, UK organisations believe their IT vendors are holding them back, with 85% of respondents to our study believing that their organisation would see faster business growth if IT vendors were less restrictive. Businesses clearly want an IT vendor that is a trusted partner, one they can collaborate with and that can add the extra level of much sought-after service needed to help organisations grow and thrive.”
HPE Aruba UK CIO, Simon Wilson, comments:
“If we are ever going to fully embrace the idea of a truly ‘smart city’, businesses need to agree that we don’t hold all the answers. Countless vendors have a POV on the topic, suggesting that they have a one size fits all solution that you can just easily integrate and, bang, you’re all set up to be ‘smart’. In reality that just isn’t the case.
“That isn’t to say that vendors who speak about the smart city don’t have a good point, but to really push the boundaries and take things to the next stage, we need to be working together to make our existing solutions better. To do that would mean we need open standards, alongside technology partnerships and collaboration.
“If you were a municipality and wanted to add new services into your smart city – traffic information services, for instance – but your existing hardware isn’t compatible with the software, what do you do? Should the IT team rip up the existing network and completely replace it? That is exactly what cities are currently struggling with, according to an Aruba study, which found that 49% of cities are struggling to integrate older technology with new.
“If our true aim is to create smart city experiences that will evolve with technology, we need an open infrastructure that is built on open industry standards, open APIs and open source code, and that is available to an open network of partners.
“The more open APIs are enabled, the greater the flexibility and speed at your fingertips. Just think: you won't have to wait for your existing vendor to develop functionality any more – new features can be bolted on using a third party.
“To provide an example of how ‘smart city collaboration’ works in action, take a look at Cambridge University. It uses an open network infrastructure to help create a public access network which is used by not only students and teachers, but also local councils, service providers and members of the public.
“This network is used by thousands of people across Cambridge every day, and functions effectively thanks to many different IT systems. This doesn’t adversely impact consumers who are using the network. Nor does it make them have to jump through more hoops just to access the Wi-Fi, because wherever they are, indoors and outdoors, their connection is uninterrupted, and their login credentials do not change.
“In order to take the next step, then, we need to stop spreading the rhetoric that one vendor can create all of the above alone and in one fell swoop. It takes time and needs to be a step-by-step process. To truly improve the experience and lives of citizens, smart cities must be built on open foundations, which can only be achieved with the co-operation and help of like-minded companies.”
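Wilson's point about open APIs can be made concrete with a short sketch. Everything in it – the city endpoint, the signage service, the field names – is hypothetical and simply stands in for whatever a given vendor actually exposes:

```
# Hypothetical example: pulling live traffic data from a city's open API and
# pushing an alert to a separate, third-party signage service. All URLs and
# fields are placeholders, not any real vendor's API.
import requests

CITY_API = "https://api.example-city.gov/v1/traffic/incidents"
SIGNAGE_API = "https://signage.example-partner.com/v1/messages"

# Query the municipality's open data endpoint for high-severity incidents.
incidents = requests.get(CITY_API, params={"severity": "high"}, timeout=10).json()

for incident in incidents.get("items", []):
    # Bolt on a new citizen-facing feature without touching the city's own stack:
    # the third party just posts messages to its own open API.
    requests.post(
        SIGNAGE_API,
        json={"location": incident["road"], "text": f"Delays: {incident['summary']}"},
        timeout=10,
    )
```

Because both sides publish open interfaces, neither vendor has to modify, or even see, the other's product – which is precisely the flexibility the open-standards argument is about.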
The future is digital. That’s a mantra no one would argue with and, in recent years, we’ve really started to see how the practical applications of AI and automation can empower us as consumers, streamlining many aspects of our interactions with the organisations that provide our goods and services.
By Andrew White, CEO, Contexta 360.
Through apps, chatbots, and automated services, we can now make secure transactions, change settings and get most of the information we need at the touch of a button, without having to talk in person with supplier organisations.
So far, so good. And on top of this, our digital connection to the company is a two-way street. All this digital interaction is generating vast swathes of data on product and purchase trends that supplier companies can use to hone their services and gain insight into customer preferences.
The “moments that matter”
However, while consumers and businesses alike have embraced the benefits of automation, there are still many situations when the human touch is essential.
Indeed, all the data generated can be used to profile customers and route interactions for optimum efficiency, striking the right balance between automation and human service. For lower-value customers with straightforward enquiries, automation works well; but for high-value customers with complex queries, human interaction is what's needed – such customers become frustrated by automation, and only a human will do.
In fact, 60 per cent of consumer interaction on average is via voice and video, and in the “moments that matter” – when there is an emergency or need for specialist advice - this rate jumps to 83 per cent. These calls are some of the most valuable from a customer service perspective. By the time the customer picks up the phone, it means they need help with something they cannot resolve themselves; they may be angry or need immediate assistance. At that point they expect to talk to someone who can solve their problems quickly and professionally, making immediate judgement calls on the best course of action – it’s the human touch that remains at the heart of customer service. For customer service agents, however, this means that the days of dealing with simple enquiries are over.
However, just because the customer interaction has, at this point, ostensibly crossed the border from digital to human, this doesn’t mean that the process shouldn’t continue to be supported by innovations in digital technology. Indeed, because of the high value strategic insight that these more complex calls offer, capturing and integrating their content into customer service streams to assist call centre agents is becoming a priority for customer-focused organisations.
Supporting the supporters and closing the digital loop
When a call comes in, the call centre agent has only three to five seconds to prepare – no time at all in which to absorb any background data available on the customer and the context of the call. When they pick up, they may initially fail to understand the customer request, which is likely to be complex. Even if they do, they must then enter it manually into the CRM system. Agents are only human, and this is where errors creep in – fatigue, pressure and inexperience can all lead to inaccuracies – and the opportunity to build up a comprehensive "digital signature" for the customer is lost. Furthermore, and most importantly for the brand, the customer's experience is less than optimal.
This is where many organisations have started using artificial intelligence and conversational computing to close the digital loop. Voice calls can be transcribed into the system with all questions or actions captured via speech-to-text, analysed for understanding, sentiment and topics. In short, it offers digital insight into the conversation, not via the agent’s keyboard, but via the conversation itself – no gaps, just great data.
The emergence of "tuned" speech processing that aligns to specific industry terms, products and phrases with incredibly high accuracy, together with the addition of AI and deep-learning capabilities, is giving call centre agents a real-time edge. These technologies "listen in" to the conversation and can detect questions, interpret sentiment, flag up key words, cross-reference with internal data or notices such as product notifications or sales offers, draw in external data from sources such as Bloomberg or other newswire services, and summarise all previous voice interactions. This helps agents provide a more informed, satisfactory service, turning those critical "moments that matter" from risks into opportunities, while protecting and enhancing the brand.
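In outline, an agent-assist loop of this kind might look like the sketch below. It is a simplified illustration rather than any vendor's actual pipeline, and the helper functions are placeholders for whichever speech-to-text, analytics and CRM services an organisation has in place:

```
# Simplified agent-assist loop. The helpers are stand-ins for real speech-to-text,
# NLP and CRM services, with toy logic so the sketch runs end to end.

def transcribe(audio_chunk):
    # Placeholder: a tuned speech-to-text engine would go here.
    return audio_chunk  # this sketch treats chunks as already-transcribed text

def analyse(text):
    # Placeholder: real analysis would detect questions, sentiment and topics.
    return {
        "questions": [s for s in text.split(".") if "?" in s],
        "sentiment": "negative" if "not happy" in text.lower() else "neutral",
        "topics": ["billing"] if "bill" in text.lower() else [],
    }

def lookup_context(topics, customer_id):
    # Placeholder: pull CRM history, product notices or newswire items for these topics.
    return {"customer": customer_id, "suggested_material": topics}

def assist_agent(call_chunks, customer_id):
    transcript = []
    for chunk in call_chunks:
        text = transcribe(chunk)
        transcript.append(text)
        insight = analyse(text)
        if insight["questions"] or insight["sentiment"] == "negative":
            # Surface relevant history and suggested actions on the agent's screen.
            print(lookup_context(insight["topics"], customer_id))
    # After hang-up, the transcript goes straight into the system of record,
    # rather than relying on the agent typing notes against the wrap-up timer.
    return {"customer": customer_id, "transcript": transcript}

assist_agent(["I'm not happy with my bill?", "Can you explain the charge?"], "CUST-042")
```

The design point is simply that the conversation itself, not the agent's keyboard, becomes the data source: the same loop that prompts the agent in real time also captures a complete record for later analysis.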
Powerful insights that form a competitive edge
The businesses and organisations that deploy this capability are capturing massive insights. They're going beyond the contact centre as a low-value, transactional service and transforming it into a strategically important differentiator. Agent performance improves dramatically, both in the quality and depth of what is captured within the system of record and simply through assistance with a summary of the call actions once the call has ended and the wrap-up timer starts.
This is a premium, non-scripted role that requires knowledgeable professionals who can react to customer needs and use a range of AI and information systems to deliver excellent service or sales strategy. It is no longer a humble job; it is the centre of the business, because every call that reaches this point seriously matters.
By supporting contact centre agents in real time with rich and detailed customer history across all channels – web enquiries, chatbots and voice calls – backed up with contextual insight around the tone and sentiment of those conversations, we make all that data work effectively on a human level at the moment the interaction occurs.
This underlines the fact that we shouldn’t be talking about a binary choice between automated or human interactions. Instead we need to operate on a customer service scale that collects and monitors data at all points – even voice – and uses it at the point it’s needed to deliver a fully 360-degree customer service. In this way we can combine the best features of automation, artificial intelligence and the human touch for optimum effect.
If you’ve glanced at the opinion columns of security industry publications, you’ve probably seen the term “risk-based” floating around, as in “the time is now for a comprehensive, risk-based approach” or “a risk-based approach to security is key to business alignment." However, many of these articles fail to define what exactly a risk-based approach to cybersecurity is. And that’s a problem — without a solid understanding of its meaning, “risk-based” could end up being just another buzzword, and all the benefits it’s supposed to bring about will never come to fruition.
By Jake Olcott, VP Government Affairs, BitSight.
What is a risk-based cybersecurity approach?
If someone tells you their company takes a risk-based approach to cybersecurity, what they mean is that when it comes to making security-related decisions, they consider risk above all other factors.
Risk-based approaches are often presented in opposition to compliance-driven approaches. Risk-based security teams are more concerned with reducing their organisation’s real exposure to cyber attack and data breach than they are about checking boxes or passing audits (though those remain worthwhile goals).
A risk-based approach to cybersecurity is also proactive rather than reactive. Instead of focusing on incident response, a CIO at an organisation using this approach is likely to invest heavily in testing, threat intelligence, and prevention.
Finally, this approach is inherently realistic. The goal of a risk-based cybersecurity program is meaningful risk reduction, not 100% security. That's important, because the former allows CIOs, CISOs, and Board members to make pragmatic decisions about budget and resource allocation, while the latter requires sparing no expense, even when investments yield diminishing returns.
What does a risk-based cybersecurity approach look like?
A security program that’s fully committed to the risk-based approach will necessarily have a few distinguishing elements.
Continuous Monitoring
Risk-based approaches to cybersecurity rely on accurate risk knowledge. On one hand, that means that one’s idea of risk should be based on facts rather than opinion, trends, or headlines. However, in the fast-moving world of IT security, data must also be up to date. That’s where continuous monitoring comes in.
This approach to security doesn’t leave room for blind spots. That means point-in-time vulnerability assessments and penetration tests that only occur once or twice per year must be supplemented by other kinds of assessments that fill in the gaps.
Security ratings are one popular option for continuously monitoring cybersecurity risk. Ratings can provide insight into compromised systems, security diligence, user behaviour, and other factors that increase an organisation’s risk exposure. These insights are synthesised into one representative number, updated daily, as well as grades in individual risk vectors.
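The mechanics can be pictured as a simple weighted roll-up of vector grades into one number. The vectors, weights and scale below are invented purely for illustration and are not BitSight's actual methodology:

```
# Illustrative roll-up of individual risk-vector grades into one headline score.
# Vector names, weights and the 0-100 scale are invented for the example only.

GRADE_POINTS = {"A": 100, "B": 80, "C": 60, "D": 40, "F": 20}

vector_grades = {            # hypothetical daily observations for one company
    "compromised_systems": "B",
    "patching_cadence": "C",
    "user_behaviour": "A",
    "open_ports": "D",
}

weights = {                  # how much each vector contributes to the composite
    "compromised_systems": 0.4,
    "patching_cadence": 0.3,
    "user_behaviour": 0.2,
    "open_ports": 0.1,
}

score = sum(GRADE_POINTS[grade] * weights[vector] for vector, grade in vector_grades.items())
print(f"Composite rating today: {score:.0f} / 100")  # recomputed as new data arrives
```

The value of the continuous approach is that the inputs refresh daily, so the composite number moves as soon as the underlying evidence does, rather than waiting for the next annual assessment.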
Independent research shows that BitSight Security Ratings correlate to data breaches — companies with a BitSight Security Rating of 500 or lower are nearly five times more likely to have a breach than those with a rating of 700 or higher.
Prioritisation
A truly risk-based cybersecurity program will have a system in place to prioritise security needs based on their relative levels of risk exposure.
Effective prioritisation relies on two key elements: knowledge of the threat and knowledge of the target. That means a security leader running a risk-based program must maintain consistent awareness of the latest and most urgent cybersecurity threats affecting their company, industry, and region, as well as a deep understanding of the systems and data those threats could affect.
With this knowledge in hand, a security leader can determine which projects require the most resources at any given moment. For example, they can say with confidence that pausing work on implementing automated incident management software in favour of updating user credentials and access will reduce the risk exposure of their organisation.
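One common way to frame that comparison is a simple likelihood-times-impact score per initiative, recalculated as monitoring data changes. The projects and figures below are invented purely to illustrate the ranking step:

```
# Toy risk-reduction ranking. Likelihood and impact use a 1-5 scale; all figures
# are invented for illustration, not drawn from any real assessment.

projects = [
    # (initiative, likelihood of the threat it addresses, impact if it materialises)
    ("Automated incident management rollout", 2, 3),
    ("Tighten stale user credentials and access", 4, 5),
    ("Patch internet-facing web servers", 4, 4),
]

ranked = sorted(projects, key=lambda p: p[1] * p[2], reverse=True)

for name, likelihood, impact in ranked:
    print(f"{name:45s} risk score = {likelihood * impact}")

# The highest-scoring items get resources first; the scores are revisited whenever
# the monitoring data changes, rather than on a fixed quarterly cycle.
```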
Prioritisation must also be dynamic, based on short cycles rather than monthly or quarterly initiatives. For this reason, prioritisation relies heavily on continuous monitoring tools like security ratings.
Benchmarking
To gain a true understanding of cyber risk, one can’t assess their organisation in a vacuum. Risk is a relative term, and can only be understood in relation to historical performance and the performance of peers, competitors, and industries.
Security ratings are based on externally observable information, meaning they can be used to assess any organisation, not just one’s own. Many organisations use security ratings to gain an idea of the cybersecurity performance of their competitors, top performers in their space, and their industry on average. In fact, these relationships are baked into the ratings themselves.
This method of cybersecurity benchmarking allows security leaders to understand how their organisation is doing in context. For example, using a security ratings platform, a CISO can see that they have a “D” grade in the malware servers risk vector, and understand immediately that they’re performing worse than other organisations in their industry. They can also look at a specific company — say a larger, more established organisation — to see which areas of their cybersecurity program have received the most attention.
How a risk-based cybersecurity approach can save time and money
Compared to compliance-driven organisations or idealistic companies that demand 100% security, an organisation using a risk-based approach can save considerable amounts of resources.
This approach can help an organisation assess the ROI of their cybersecurity projects, and stop spending on tools and systems that aren’t returning value. Many organisations have spent millions on best-of-breed software, only to be breached as a result of user error or an underprepared third party. A risk-based approach can help a company avoid these scenarios.
In addition, this approach can reduce an organisation’s reliance on expensive security consultants and large point-in-time assessments. By using tools to assist with their security performance management, a company can develop the skills to assess and prioritise their security program in-house, continuously.
Most importantly, however, this approach may be better at reducing an organisation’s chances of experiencing a data breach. With the average total cost of a data breach reaching $3.86 million in 2018 ($148 per lost or stolen record), that could mean the difference between survival and failure.
High-Performance Computing (HPC) and its ability to store, process and analyse vast amounts of data in record time is driving innovation all around us.
By Jim Donovan, Chief Marketing Officer, Panasas.
With enterprises increasing their use of emerging technologies such as Artificial Intelligence (AI), machine learning and augmented reality to improve productivity and efficiency, they are looking for the best high-performance data storage infrastructure to support business operations and make automated decisions in real-time. HPC data storage systems rely on parallel file systems to deliver maximum performance and there are two options to choose from: open source or commercial parallel file systems. Opinions abound on both, so it’s worth examining what’s hype and what’s real.
Cost of acquisition – What’s better than free?
An inherent part of any open source product is the fact that its acquisition is free to the user. This is no different with open source parallel file systems such as Lustre and BeeGFS. While there are highly proficient Lustre and BeeGFS architects and developers in HPC labs around the world ready to tackle each system's complex set-up, tuning and maintenance requirements, enterprise users can become overwhelmed by a system that lacks the manageability and ease of use they have grown accustomed to in their existing IT environment. By the time Chief Information Officers (CIOs) factor in the cost of the additional staff required to implement and manage an open source parallel file system, there's quite a price tag associated with the 'free' acquisition.
Here’s where commercial parallel file systems have a competitive edge over open source offerings. Commercial parallel file systems are delivered as plug-and-play systems that offer some of the lowest total-cost-of-operations and ownership in the business. This is due to ease of deployment and simple manageability, which results in negligible administrative overhead. In addition, commercial file systems are capable of automatic tuning and retuning as workloads change, thereby reducing the opportunity cost of downtime.
Confusing customisation with flexibility
Open source file systems allow for an individual implementation of the code, giving skilled users the ability to modify, customise and expand the code’s functionality to meet their organisation’s unique workflows. But are users looking for customisation or do they actually yearn for more flexibility?
If true customisation is needed, enterprise users should assess the type of skill set and number of staff required to successfully modify and support the open source code. If flexibility is the ultimate goal, today’s modern commercial file systems offer dynamic adaption to changing workflows without making changes to code.
Built on industry-standard hardware that allows for the rapid adoption of new technology, commercial parallel file systems are self-tuning solutions, and purpose-built for adaptability and flexibility to handle a wide range of use cases. Users can configure the system to their exact workload needs without overprovisioning any single component. Systems scale without limitation and bandwidth, capacity, and metadata performance can be independently set with granular control.
Elimination of the ‘performance’ gap
Commercial parallel file systems have closed what used to be the performance gap with open source. The performance of today’s open source parallel file systems is on par with commercial portable file systems, which leverage the latest hardware and storage media technology. The ability to quickly scale in increments without interruption and tuning is crucial for commercial applications to stay on track and meet demanding time-to-market schedules. The processing of large and complex data sets with high precision while handling thousands of I/O operations simultaneously is a must for high-end computing deployments in the commercial space, such as computer-aided engineering (CAE) simulation and analysis, energy exploration and drug development, as well as emerging workloads such as AI and autonomous driving.
Performance is optimised and reliably consistent when the software and hardware are pre-tuned, allowing the system to adjust automatically to increasing complexity. This is the case with portable, commercial parallel file systems that have been optimised for, and are in tune with, pre-qualified commodity hardware components. Open source file systems don't benefit from the same level of seamless integration, as they often require deep knowledge of how the storage system works in order to tune and re-tune it for the maximum level of performance and bandwidth utilisation required by different workloads.
System Maintenance – What does it take to keep things running reliably?
In the fast-paced world of HPC, users are tackling new and complex projects all the time. Data storage is an essential component in guaranteeing business critical deliverables, and solutions that are easy to deploy, manage and scale, have an immediate impact on a company’s bottom line. Simplicity across the board translates not only into low administrative overhead, but a finely tuned, self-managing system in which all the common maintenance workflows, as well as data reliability, have been automated. This means there is no need for enterprise users to worry about downtime, lost data or late-night emergency calls.
Commercial file systems have mastered this 'lights-out' operational approach, while users of many open source counterparts still spend a considerable amount of time on day-to-day storage management and maintenance, dealing with the time-consuming, complex and error-prone business of tuning in order to optimise the interaction of software and hardware.
Bringing it all together
Today, the need for high-performance data storage infrastructure in the commercial enterprise cannot be overstated. The massive volumes of data generated by emerging technologies such as AI and machine learning are growing exponentially, thanks to the ease of application integration with enterprise business across all industries, from manufacturing to life sciences. Fuelled by hardware innovations and software-driven services, HPC data storage systems are allowing enterprises to use new technology to achieve greater levels of productivity and operational efficiency than ever before, and it is the outstanding performance of parallel file systems that is servicing that demand.
When all the evidence is considered, enterprise CIOs who want to avoid the potential operational and reputational risk of failure will see that the benefits of choosing a commercial parallel file system strongly outweigh the exposure of financing the in-house resources and building the infrastructure required to implement an open source solution.
DW asked a range of industry professionals for their views on open source technology. Is it already mainstream and being used by most, if not all, businesses, or is there still a way to go? Maybe open source will never become mainstream? Here we round up their thoughts on how, where and when open source makes sense and is already being adopted.
Lynne Capozzi, CMO, Acquia, comments:
Marketing is one sector that needs to take advantage of the benefits that open source has to offer. Marketers are constantly looking for ways to optimise customer experiences (CX) for the brands they work for, and open source enables them to take their service to the next level.
Due to their nature, open source platforms empower marketers to tweak and refine the CX in response to customer preferences and feedback. They let businesses provide a world of online experiences and, best of all, these experiences work seamlessly and fluently.
The collaboration by which open source develops has generated elements that answer the needs of businesses and customers alike. Marketers can integrate them, ‘pick and mix’ style, to generate their own, distinctive and bespoke CX whenever customers meet a brand online. That’s the kind of user experience that will bring customers back, time and again.
For example, marketers may wish to run versions of open source that deliver content directly to smart devices like watches or speakers. Alongside this, they can integrate customer-facing, CX-optimising modules into content management systems (CMS), including optimisation for mobile, social media integration and fast performance. At the same time, they can add to the same CMS a range of elements designed for the back office, e.g. single sign on, best-in-class analytics and real-time reporting.
The larger open source solutions are supported by a widespread, diverse and enthusiastic group of developers, users and supporters, who often interact through forums and elsewhere. That means whatever the issue or question, there is bound to be somebody out there who can help – possibly for free.
It can be argued that marketers and brands are failing to give customers what they want, without even realising it. And those customers will walk away, often the very first time they are disappointed. Now, more than ever, marketers have a range of options at their fingertips that can make the CX more streamlined, effective and personalised than ever. What they need is a platform that allows them to integrate these elements in a modular way, to scale their digital offering appropriately and most importantly of all, to accommodate the as-yet unknown advances of the future. To do that, marketers need open source.
Fabian Hueske, Co-founder, Software Engineer at Ververica, says:
Open Source Software (OSS) has come a long way to become a major aspect of modern application development. Today, the impact of the OSS industry is very broad, as organisations of practically every industry and size use OSS to some extent. Gartner found that 95% of mainstream IT organisations leverage open source software assets within their mission-critical IT portfolios – directly or indirectly, through commercial proprietary software using OSS libraries. And with more and more companies adopting the 'commercial open source' business model, OSS is only set to grow further in adoption in the coming years. Such vendors focus on providing value-added services – such as training, customisation, support or warranties – to corporate users, further contributing to the increase in OSS adoption.
In fact, the 2018 Open Source Program Management Survey revealed some interesting findings on how open source software is being adopted and what we can expect from open source in the coming years. More precisely, the survey finds that more than half of respondents (53 percent) across all industries have an open source software program in production or have plans to establish one. The same survey finds that company size is a decisive factor in open source adoption, with large enterprises twice as likely to run open source software programs as their smaller counterparts.
But how, where and when does open source make sense to adopt? Should companies be adopting open source for specific projects, or adopting it widely across the organisation?
The specific characteristics of open source software such as Apache Flink make it uniquely suited to the majority of application and software development needs. For example, in projects that need greater flexibility, OSS is probably your best bet: developers can customise the software, adding extra functionality or removing unnecessary parts to meet specific business requirements. Additionally, long-term projects requiring greater stability are a perfect fit for open source, since the software relies less on a single vendor and its developers and more on a user and developer community devoted to supporting the technology for the long run. Team leads who predict they will need support for the development of a product or application can also rely on open source software's alternative support options – support from a vendor, from a consultancy firm working on the specific technology, or from the community and users of the technology, who can provide good tips and example use cases. Last but not least, open source technology comes with possible savings for an IT or data department. Although open source can come with billable customer support, this usually costs less than proprietary software licences, whose purchase costs can sometimes send your accounting bills skyrocketing.
Rob Whitely, CMO at NGINX, Inc., believes that open source software (OSS) is now mainstream in the enterprise:
In fact, in Red Hat’s most recent State of Enterprise Open Source survey, 69% of the 950 IT professionals they polled cited open source as either extremely important or very important to their IT strategy. A nearly identical percentage, 68%, said they increased their use of open source in the last 12 months. Viewed from another angle, only 1% said open source was unimportant and only 3% decreased usage in the last 12 months.
OSS is found in every vertical and at every stage of enterprise maturity. It powers public cloud computing and is a component of nearly every hardware and software solution. Even if an enterprise isn’t directly deploying OSS, they are using solutions that leverage it.
Why is OSS so mainstream? There are five main reasons.
First and foremost, OSS is nearly synonymous with innovation. It’s beloved by developers, and thus receives the majority of new, creative energy. Accelerated by the cloud and fueled by digital businesses, OSS is the backbone of every major app you interact with today.
Secondly, OSS offsets skill-set gaps. Historically, if an enterprise built an application in-house using a popular language, then the entirety of managing, patching, and maintaining that app fell to in-house developers. What if they quit? What if the language it's written in became obsolete? OSS offsets these risks by creating a community of developers that maintain the code base. Your in-house developers can customise or tune it without having to carry the full development burden.
Closely related to this last point is that OSS attracts talent. If you're a bank working on a new mobile app or a retailer trying to build a compelling website, you need to fight for top developer talent. Today's developers don't want to work on outdated, proprietary solutions. They're problem solvers and want to adopt and adapt OSS to solve whatever puzzle confronts them. Large enterprises simply can't compete in the talent war if they're seen as a closed source shop.
OSS solutions also reduce cost. One of the original values behind open source was that it was free. Why pay for proprietary, packaged software solutions when you can get a viable alternative at no cost? Well, it turns out that, as in life, free is usually not really free. You still have to configure and support OSS, but there is no doubt that OSS can reduce the total cost of software ownership.
Finally, OSS is more secure. That may seem counterintuitive, but it goes back to one of the core tenets Linus Torvalds espoused when he created Linux: the wisdom of crowds. Having more eyes on software means more testing, bug fixing, and hardening. OSS solutions benefit from a degree of security rigor that most companies can’t match – either with software developed in-house or purchased proprietary offerings.
Tim Mackey, Principal Security Strategist, Synopsys CyRC (Cybersecurity Research Center), says:
“Open source activity is at the heart of modern application development, particularly when you look at trends relating to IoT, cloud computing and mobile applications. In each of these domains we see a rapid rate of innovation – something which would be difficult to accomplish if each and every vendor were required to create an entire application stack from scratch. Similarly, IT organisations building their own applications get it done better, faster and cheaper by leveraging open source. The use of open source operating systems like Linux, security libraries like OpenSSL, platforms like Kubernetes and runtime environments like Node.js, down to smaller pieces of open source code, enables development teams to focus on their unique functionality while benefiting from the expertise of others. While this dynamic often creates the perception that open source software is free to consume, its usage is not without obligations. Those range from the obligations imposed by the open source licences the component authors applied to their software, through to the creation of appropriate patch management and security strategies based on which components are selected.”
Using open source to support e-democracy worldwide, as Myfanwy Nixon, Marketing and Communications Manager, mySociety, explains:
One area where open source offers particular advantages is in civic technology, enabling those who are already successfully using digital technology to hold governments and public bodies to account to share their tools with others who would like to do the same.
We build digital tools that give people the power to engage directly with government, and share our technologies so that they can be used anywhere in the world. Almost all our sites are open source, helping organisations in many different countries to run online projects. They range from the Alaveteli codebase that enables people to run their own Freedom of Information (FOI) website to tools for helping people to monitor, understand or contact their elected representatives. There is even a tool which can run in any country, and which requires no coding knowledge. It enables citizens to write to their politicians and publishes the resulting conversation online, and can be integrated into any existing website.
We also provide open data sets, such as EveryPolitician, a repository of open, structured data which aims to cover every politician in every country in the world and which is free to use.
With more than a decade of experience, we've accumulated extensive knowledge about what works and what doesn't in implementing e-democracy. One of the key things we've learnt from using open source technology is that it's not only about the software. Something that has become clear over the years is that everyone working in the civic tech area faces similar challenges: no matter how different our countries' legislation, culture or politics, the issues are broadly the same. As a result, we've set up online communities which enable people to share what they've learned or to ask for advice.
We've also developed a wealth of experience and actual usage data that ultimately changes the way we build and develop our platforms. We understand the fields we work in well (the "problem domain"), whether it's governmental practice or civic user behaviour – knowledge that is often not encapsulated anywhere in the program code.
Furthermore, any established platform must protect against the risk that new changes break old behaviour — something that regression testing is designed to catch. This is especially important on platforms like FixMyStreet or Alaveteli where the software is already running in multiple installations.
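In practice, that protection usually takes the form of an automated test suite run against every change. The example below is a generic, hypothetical check in that spirit – it is not code from FixMyStreet or Alaveteli themselves, and the function names are illustrative only:

```
# Hypothetical regression test: confirm a change to report submission hasn't broken
# behaviour that existing installations rely on.

def submit_report(category, description, postcode):
    # Stand-in for the platform's real submission logic.
    if not postcode:
        raise ValueError("postcode required")
    return {"status": "confirmed", "category": category, "description": description}

def test_report_submission_still_requires_postcode():
    # Older installations depend on this validation; a regression here would break them.
    try:
        submit_report("pothole", "Deep hole on the high street", "")
    except ValueError:
        return
    raise AssertionError("submission without a postcode should have been rejected")

def test_valid_report_is_confirmed():
    result = submit_report("pothole", "Deep hole on the high street", "CB2 1TN")
    assert result["status"] == "confirmed"

if __name__ == "__main__":
    test_report_submission_still_requires_postcode()
    test_valid_report_is_confirmed()
    print("regression checks passed")
```

Run against every proposed change (for example with a test runner such as pytest), checks like these let one shared codebase evolve without quietly breaking the installations already live in other countries.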
By sharing our practical experience through communities and events as well as sharing the actual software and data, we believe open source can become much more valuable and easier to implement.
Stephan Fabel, Director of Products at Canonical – the company behind Ubuntu – comments:
For many technology specialists, open source came into prominence about 20 years ago, but it's only now that it is beginning to gain ground in the mainstream. What was once just the tool of disruptive companies is now being relied on by more traditional organisations, which are increasingly pushing their developers towards the digital arena.
This trend has been largely driven by high profile acquisitions, such as the IBM Red Hat and Microsoft GitHub deals in 2018, which have cemented the value of open source and its role in the enterprise. Today, open source is seen by many as a necessity for success, with businesses realising the benefits it can bring from a technological perspective, as well as providing a cost-effective option in comparison to proprietary alternatives.
Open source is an economical option, but it is also the key to faster, more reliable innovation, enabling businesses to scale their infrastructure to meet business needs with help from a community at the cutting edge of research and development. In fact, we are seeing that the developers who work on the latest technologies in open source are able to accelerate the adoption of those technologies within their own organisations.
Why? Well, developers who are privy to the leading edge of innovation are freely exchanging ideas in the open source community. The arrival of new technologies such as AI, machine learning and robotics is the product of pushing developers to resolve testing and implementation issues by leaning on the best in the field and learning from other members of the community.
The biggest challenge to leveraging open source will come from perceptions of the value it can deliver. Despite growing popularity, there are still a number of organisations that have IT regulations which prevent the use of open source and mandate the use of proprietary software. This reluctance is often the result of a poor understanding of open source tools and a misconception that they are unsupported and lack the security and compliance deemed necessary for commercial use.
Ultimately, open source is a primary enabler for all the technological innovations we are seeing now and will continue to be for those in the future. Emerging technologies will be driven by software that capitalises on a collaborative approach, not just from one company, but from a community focused on improving the whole landscape. From smart cities to datacentres, self-driving cars to medical robots, the solutions to these problems are best entrusted to an army of intelligent developers, rather than a select few companies to produce the very best outcome.
Dr. James Stanger, chief technology evangelist for CompTIA, answers:
Is open source already mainstream and being used by most, if not all, businesses?
JS: Open source is already mainstream, but it has been in “stealth mode” for years. When it has been used, it is used behind the scenes. I’m fond of saying that Linux drives IT. In the cybersecurity world, pen testers and security analysts use Linux as their primary toolbox (e.g., Kali Linux, Parrot Linux, bro, Security Onion, THC Hydra). Linux drives the cloud. Google has long used Ubuntu and now “pure” Debian systems for its infrastructure. The majority of systems used in Microsoft’s Azure system have been Linux systems for years. The Android phones and IoT devices we use, and the IoT systems in our cars and entertainment systems and on airplanes are all gussied-up Linux implementations. But due to the nature of how Linux is used, it has remained in stealth mode until the last two or three years. Now, “Microsoft loves Linux.” We’ve seen ChromeOS take center stage. In the IT space, we’re seeing it take a much more mainstream posture. It’s out in the open. In many ways, we’re in a “Linux 2.0” world.
If not, is there still a way to go?
JS: For Linux to become more mainstream, we do have a way to go. First of all, we don't have enough qualified IT techs and developers to use Linux properly. We're also seeing businesses struggle to use Linux properly with IoT. Too often, we're seeing good, strong, mainstream companies embed and use Linux so badly that they're actually making security issues worse. So from a skills perspective, an IT management perspective, and a software development lifecycle perspective, the industry continues to struggle in making Linux properly mainstream. It's not easy, but there are companies and organizations that use Linux well. We see this in Germany, for example, where Linux is used as a mainstream solution; it is rolled out properly. We've also seen Linux used well by developers and organizations in Japan and sometimes in the UK and the United States. I'd argue that if we properly upskill our IT workers and get business leaders to properly understand a solid, secure software development lifecycle, Linux will become properly mainstream, rather than improperly rushed to market.
If not, will open source ever become mainstream?
JS: Absolutely. It is mainstream, if you ask me. Microsoft, for example, has joined the Linux Foundation. We're seeing every major sector use Linux in business-critical, foundational ways. Banks use Linux almost exclusively. Manufacturing has used Linux for decades for its supply chain, as well as its Industrial Control Systems (ICS) and SCADA systems. The finance, hospitality and restaurant industries use it. From data analytics (e.g., Hadoop) to business intelligence, we've long seen Linux serve as the foundation for these industries. However, we often see managed service providers and value-added resellers struggle with Linux. Many Linux vendors have not favored the business model that these resellers and providers have. Once the business models align, we'll see much more mainstream adoption in these areas.
Leo Craig of Riello UPS and RWE Product Manager Dario Hernandez make the case for why operators should rethink the role of their uninterruptible power supplies when designing a data centre.
UPS systems have tended to play a vital yet clearly defined role in the day-to-day running of a data centre. They’re the invaluable safeguard in case of any problems with the electricity supply, reducing the risk of damaging downtime and service disruption.
In developed nations like the UK, however, major power outages or network crashes are thankfully rare events, meaning the battery backup safety net provided by a UPS isn’t called into action too often. While a UPS is clearly an essential asset providing must-have power protection, it could quite easily be seen as something of an expensive and underutilised piece of a data centre’s critical infrastructure.
But what if there was a way to reduce significantly the upfront costs of UPS systems with the potential to provide revenue opportunities, all while strengthening overall system resilience?
We’re only too aware of the doubts some data centre operators have about energy storage and demand side response. When uptime is your priority above everything else, there’s an understandable reluctance to do anything that could pose even the slightest risk to power continuity. Why endanger your business – and that of your customers – when the rewards aren’t that great anyway?
What we’d say to those with that view is there’s now a viable alternative that tackles those reservations head-on.
Combining Riello UPS’s expertise in uninterruptible power supplies with RWE’s position as one of Europe’s largest energy traders, we’ve come up with a solution that delivers reduced capital and day-to-day costs at the same time as enhancing rather than compromising on reliability.
It transforms a reactive, underutilised data centre UPS system into a dynamic ‘virtual power plant’ that’s proactively working for operators 24/7.
Partnership Powering Change
The concept centres on an adapted UPS fitted with a special rectifier which allows electricity to flow both to and from the grid. This advanced, energy efficient power supply is supported by premium lead-acid or lithium-ion battery blocks fitted with dedicated monitoring and communications software enabling real-time analysis and interaction with the grid.
The mandatory battery monitoring improves reliability by quickly identifying any potential issues, meaning problematic cells can be replaced before they get the chance to catastrophically fail. Compare this to traditional UPS systems using inferior lead-acid batteries, where it’s far trickier to monitor the batteries and you can’t really be 100% certain they’ll even work when they’re called upon.
The battery capacity in our collaborative solution is divided into two parts with very specific roles. Firstly, there’s a section that is only ever used to provide backup in the event of a power failure – the classic safety net. This is complemented by an additional ‘commercial’ section used either for demand side grid services such as frequency response or to avoid more expensive peak-time power charges.
If the worst happens and there is a power failure, the primary backup section provides the supply, but any remaining energy in the 'commercial' section is also offered to top it up, lengthening the overall backup time.
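To make the split-capacity idea concrete, here is a minimal, hypothetical sketch of the arrangement described above. The class name, the 167 kWh backup figure and the discharge amounts are illustrative assumptions only, loosely based on the 1 MW load, 10 minutes of autonomy and 1 MWh commercial segment used in the example later in this article.

```python
# A simplified model of the split battery bank described above: a reserved backup
# section plus a 'commercial' section used for demand side response, with any
# remaining commercial energy extending autonomy during an outage.
# All figures are illustrative assumptions, not Riello UPS or RWE specifications.

class SplitBatteryBank:
    def __init__(self, backup_kwh: float, commercial_kwh: float):
        self.backup_kwh = backup_kwh          # reserved solely for power failures
        self.commercial_kwh = commercial_kwh  # traded for DSR / peak avoidance

    def trade(self, kwh: float) -> float:
        """Discharge the commercial section for grid services; backup is untouched."""
        used = min(kwh, self.commercial_kwh)
        self.commercial_kwh -= used
        return used

    def autonomy_minutes(self, load_kw: float) -> float:
        """During an outage both sections support the load, backup section first."""
        return (self.backup_kwh + self.commercial_kwh) / load_kw * 60


bank = SplitBatteryBank(backup_kwh=167, commercial_kwh=1_000)  # ~10 min at 1 MW, plus 1 MWh
bank.trade(400)                                                # part-way through a DSR event
print(round(bank.autonomy_minutes(load_kw=1_000), 1))          # ~46 minutes still available
```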
Our first pilot plant has been running successfully since last September at RWE's global HQ in Essen, Germany. The project, which carries a secure load of 100 kW, has also been shortlisted in the category for best "Data Centre For Smart City" at the prestigious Datacloud Global Awards 2019.
A second pilot plant here in the UK is also due to be up and running later this year.
Savings Without Sacrificing Resilience
Our ‘virtual power plant’ option offers two main advantages for data centre operators. To start with, RWE covers part of the cost of the more expensive premium batteries, significantly reducing the upfront investment in a new UPS. In addition, RWE takes on the associated risk of trading on the energy market.
Depending on where a data centre is connected to the grid, it could also save up to £6,000 per MW every year through reduced grid charges. Operators can optionally tap into DSR mechanisms such as Firm Frequency Response (FFR), which pays companies that can reduce power consumption or turn up generation to help keep the grid frequency stable within 1% of 50Hz (i.e. between 49.5Hz and 50.5Hz).
With the network needing an average of 800 MW of FFR capacity and new flexible sources required in the future, there’s a sizeable and consistent demand that mission critical sites can take advantage of.
The following calculations are indicative only, as all costs are project specific, but they offer an insight into the potentially significant savings on offer. They are based on a data centre with a 1 MW load, batteries providing 10 minutes of autonomy and a 1 MWh commercial segment; a short calculation sketch follows the tables.
Initial Capital Costs | Conventional UPS | Riello UPS & RWE Solution |
UPS (including Comms Card) | £190,000 | £190,000 |
Batteries (including Cabinets) | £160,000 | £80,000 |
Installation | £22,000 | £22,000 |
Commissioning | £3,500 | £3,500 |
Total | £375,500 | £295,500 |
Total CAPEX Saving = 21% (equivalent of £80,000*)
Annual Operating Costs | Conventional UPS | Riello UPS & RWE Solution |
UPS Maintenance | £4,500 | £2,200 |
Remote Monitoring | £1,500 | £800 |
Total | £6,000 | £3,000 |
10-Year Operating Cost | £60,000 | £30,000 |
Total OPEX Saving = 50% (equivalent of £3,000 per year*)
* Please note all figures are illustrative only and all actual costs and savings are project dependent.
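For readers who want to reproduce the arithmetic behind the headline percentages, here is a minimal sketch using only the illustrative figures from the tables above; none of the values are quotes for a real project.

```python
# Minimal sketch reproducing the illustrative CAPEX/OPEX comparison above.
# All figures are the article's indicative numbers, not real project costs.

conventional = {
    "UPS (incl. comms card)": 190_000,
    "Batteries (incl. cabinets)": 160_000,
    "Installation": 22_000,
    "Commissioning": 3_500,
}
virtual_power_plant = {
    "UPS (incl. comms card)": 190_000,
    "Batteries (incl. cabinets)": 80_000,   # premium batteries part-funded by RWE
    "Installation": 22_000,
    "Commissioning": 3_500,
}

capex_conventional = sum(conventional.values())         # £375,500
capex_vpp = sum(virtual_power_plant.values())           # £295,500
capex_saving = capex_conventional - capex_vpp           # £80,000 (~21%)

opex_conventional = 4_500 + 1_500                       # maintenance + remote monitoring
opex_vpp = 2_200 + 800
opex_saving_10yr = (opex_conventional - opex_vpp) * 10  # £30,000 over ten years

print(f"CAPEX saving: £{capex_saving:,} ({capex_saving / capex_conventional:.0%})")
print(f"OPEX saving over 10 years: £{opex_saving_10yr:,}")
```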
If you’re a data centre operator facing up to the prospect of replacing a legacy UPS any time soon, there’s a compelling case to make a ‘virtual power plant’ part of those plans.
Subsidised batteries lower the initial capital investment by around a fifth. Sophisticated monitoring software helps slash ongoing maintenance costs, which could add up to tens of thousands of pounds over the typical 10-15 year lifespan of a UPS. There’s potential for substantial grid tariff savings plus the opportunity of earning additional revenue from grid services. All this plus enhanced reliability too.
With more and more of the country’s future energy requirements set to be fulfilled by a combination of renewables and battery storage, we can help data centres ditch the doubts and transform their UPS from a reactive fall-back policy into something that’s proactively saving them money whilst still carrying out its pivotal role of ensuring power continuity.
Artificial intelligence (AI) has huge potential for wireless networks and for the people that must protect — as well as those who try to attack — them. It’s a rapidly changing landscape, and in this article, I explain how our industry is most likely to be affected by AI this year and what’s shaping up for the future.
By Thorsten Kurpjuhn, European Security Market Development Manager at Zyxel.
Defining AI
In our context, AI is the development of computer systems and software that can replicate processes usually requiring human intelligence. In other words, AI imitates fundamental human behaviours using predictive intelligence based on big data, such as movement (robotics), hearing (speech recognition) and vision (object recognition). However, AI can or will, at least theoretically, at some point exceed humans' capabilities in these areas, which makes it simultaneously exciting and terrifying. As it stands, AI is far from becoming truly 'artificially intelligent' and has a long way to go in developing both emotional and logical intelligence beyond data analytics.
Cybercrime and AI warfare
Cybercriminals are always quick to exploit the latest technology and AI is no exception. We are already facing a cybercrime pandemic, and this will worsen during 2019 as cybercriminals become more sophisticated and organised. Cybercrime is no longer the domain of lone hackers; it has become a huge business with sophisticated operating models and a low barrier to entry.
The organisation of cybercrime is now so extensive that wannabe cybercriminals don’t have to be technical experts. AI allows them to use very targeted, automated tools and these may even learn as they go, getting incrementally better at causing harm. It’s becoming more common for malware to contain nasty surprises such as sleep timers that cause it to open minutes or even days after the file has been declared safe, or the ability to detect and respond to mouse movements.
Small and medium-sized businesses (SMBs) with limited security resources are likely to be most vulnerable. However, everyone is at risk as AI-powered crypto-viruses and other forms of malware proliferate and are deployed with pinpoint accuracy.
AI warfare, which is effectively industrial or political espionage, or competitive intelligence gathering enacted by computer intelligence, is another rising threat. Even the German parliament has fallen victim to this. The implications for AI warfare between businesses are substantial and 2019 is likely to see many ramp up their cybersecurity arrangements to combat it.
The biggest lesson to be drawn from this is that many traditional security measures are no longer good enough. AI works like the human brain: it learns, it develops, and it grows. No firewall or out-of-the-box virus checker can compete with that. In 2019 we must all move on.
Advanced Threat Protection
Advanced Threat Protection (ATP) will become more widespread in 2019, thanks to the superior protection it offers against AI-based threats.
ATP provides real-time monitoring and protection of the network, which is crucial when threats are increasing, frequently novel, able to infiltrate and spread within a network at lightning speed and incredibly difficult to get rid of. The need is to detect and silo threats before they have any chance to deploy.
Businesses can't afford to wait for their firewall or virus checker's next upgrade if the threat is in the here and now. Real-time protection and surveillance is all-important.
Cloud computing, combined with a more virtuous application of AI, gives ATP another edge. Machine learning allows it to understand and thus detect evolving threats. The more data it has (drawn from the business or businesses using it) the better it does. Cloud computing allows this knowledge to be aggregated and shared, creating an ATP that gets better by the hour.
ATP — previously a specialist tool — will move into the mainstream this year.
Sandboxing
Sandboxing is a crucial part of ATP, but not all sandboxes are the same. The best now watch activity at the processor instruction level, detecting and blocking malware (including zero-day events) before it is deployed. What’s more, current sandboxes use the power of AI to share information with cloud-based ATP and associated networks, so intelligence is quickly shared and everybody benefits, almost immediately, from better protection.
As a result, the firewall is more or less obsolete and sandboxes (and wider ATP systems) are rapidly replacing it. That change will accelerate during this year.
What are the practical implications?
For SMBs, the growth of AI and its potential applications for both good and ill demand a move to the cloud.
Local security solutions just don’t cut it any more: businesses desperately need the protection of ATP and sandboxing, but they need it in the cloud because that’s where meaningful volumes of data are aggregated, and protection evolves in response to that.
AI allows tech to cross-check inputs and events to understand threats more fully. Systems can then make meaningful predictions and mitigate threats effectively in real-time using machine learning. Just like human understanding, the protective system learns and grows.
When this type of machine learning is applied to an ATP system, everybody who is protected by that system benefits from the threats that they — and others — have already dealt with. That learning might have occurred a year, a week, a day or even ten minutes ago: AI can use all of it, fast.
We are not yet at the point where ATP and sandboxing can replace all other security measures, but in time they will. Right now, savvy organisations are using them alongside other solutions where required.
Many SMBs will be frightened by the growing threat of cybercrime, and rightly so. Just one successful malware attack can bring enough financial, reputational and legal damage to terminate a business. But with advanced, cloud-based and above all, AI-driven security, the future is looking far brighter for business than it is for the cybercriminals.
Although hackers are turning to banking trojans, cryptomining and other increasingly sophisticated attacks, ransomware is still a top threat for businesses. Hackers have upped their game by attacking more frequently but demanding less ransom, and the problem may be bigger than we know – especially in Europe.
By Ryan Weeks, CISO, Datto.
According to Datto's latest European State of the Channel Ransomware Report, businesses in Europe are now suffering more from ransomware attacks than their global counterparts. The survey found that more than four in five (84 per cent) European managed service providers (MSPs) had seen attacks against their SME customers – a higher percentage than on any other continent. In addition, 42 per cent of European MSPs reported that their small and medium-sized clients had suffered multiple incidents in a single day. Again, this was higher than the global average of 35 per cent.
This is a worrying trend when you consider how crippling ransomware attacks can be, especially for smaller businesses. The associated system downtime costs them around £26,300 on average, which is 12 times greater than the actual ransom requested. No wonder many businesses never fully recover from an attack.
With ransomware infections in the cloud also continuing to increase – 49 per cent of cloud-based malware infections now target Office 365 - it is more important than ever for organisations to implement a solid data protection strategy. A central part of this is a business continuity and disaster recovery (BCDR) plan that minimises the risk of being unable to access business-critical systems, also known as downtime, and loss of sensitive information. This also goes a long way towards meeting rigorous compliance requirements, such as those introduced by the GDPR.
While the disaster recovery part of a BCDR strategy focuses on the speediest possible recovery of affected data and systems, it’s the business continuity part that ensures operations can continue as smoothly as possible during the ransomware attack, to minimise disruptions and the impact of lost revenue.
Smaller businesses in particular are under threat from ransomware, but they often lack the resources and expertise to build and maintain an effective BCDR strategy. Outsourcing the task to a managed service provider can solve this headache, as Abbots Care found.
As a care agency that provides home care services to almost 1,000 patients in Hertfordshire and Dorset, Abbots Care must meet rigorous data security standards. The industry is heavily regulated by the Care Quality Commission; companies that don’t comply with the strict requirements can be criminally prosecuted.
However, like many other businesses in the healthcare sector, Abbots Care relies on technology. In order to roster its patients and support the care workers who look after them, the company has to securely handle and store large amounts of sensitive information, including patient health records and their care plan needs.
If the technology were disrupted or failed, the people cared for could suffer. If care givers weren’t able to access critical data, patients would have unmet needs, such as missing medication or meal times, and this could have potentially catastrophic results.
With the wellbeing of its patients as its biggest priority, the care agency needed a reliable solution to minimise any risk of system downtime and loss of sensitive data. At the same time, the company wanted to continue focusing on its core business of delivering care, rather than building extensive in-house IT knowledge, so it turned to managed service provider Fifosys for support.
After carrying out a full risk audit and mapping how data was stored within Abbots Care’s IT systems, Fifosys proposed a combination of Datto SaaS Protection for the company’s 500 Office 365 email accounts and Datto SIRIS, a fully featured platform for backup, recovery, and business continuity for local, virtual, and cloud environments – all delivered as an end-to-end managed service.
The implementation process took just two weeks. This included consolidating Abbots Care’s business data into a single environment and implementing policies and controls on how employees should store data. User training was not required as all the backups are run by Datto’s automated solution and monitored by Fifosys.
Fully managed 24/7/365 by Fifosys, the BCDR solution now provides comprehensive protection against any form of data loss, whether from malicious attacks, user error or ransomware attacks. It also helps meet rigorous data security standards, guarantees maximum protection for sensitive patient data and enables compliance with the requirements of the GDPR.
All data is backed up every two hours to a highly secure UK datacentre. Automatic snapshot verification ensures that the backup data is valid and sound, so Abbots Care has the ability to recover from any system outage in less than an hour. A physical appliance at the company's headquarters protects data on the local servers and speeds up recovery times even further, so that the head office team can continue working within 15 minutes in the case of an incident.
Built-in, patented Screenshot Verification automatically validates the integrity of backups, essentially running a mini disaster recovery test every single day. Any problems are flagged to the Fifosys operations team so they can deal with them immediately.
Since the implementation of the managed solution, Abbots Care hasn’t had any system downtime and has been able to scale its business, confident that its IT infrastructure is safe. Should the systems go down – due to a malicious attack, user error or other external factors – all business critical data can be recovered fast and Abbots Care’s employees can continue to provide uninterrupted care services to their patients.
What’s more, the care agency doesn’t need to spend any time on ensuring its data is adequately protected: It can rely on round-the-clock expert support for one single managed solution that, uniquely, covers all three key areas of backup, business continuity and disaster recovery. Backups happen automatically in the background without any disruption to the employees, and with an MSP monitoring the solution, Abbots Care has complete peace of mind that its customer and business data is fully protected at all times – even if ransomware strikes.
By Steve Hone, CEO and Co-founder, The DCA
Today there is no denying that we live in a very data-hungry society. We currently consume an estimated 250 terawatt hours (TWh) of energy delivering online digital services globally each year, a figure which Swedish researcher Anders Andrae of Huawei predicts could increase to 1,200 TWh by 2025.
IDC similarly predicts that the collective sum of the world's data will grow from 33 zettabytes this year to 175ZB by 2025. That's a compounded annual growth rate of 61 percent.
These headlines are daunting; however, it is the role of the DCA, as the Trade Association for the data centre sector, to educate consumers and policy makers. It's important they understand what is driving this unprecedented demand and what the data centre community is doing to reduce the environmental impact associated with that growth.
Whether you are a colocation provider or a hyperscaler, and irrespective of whether your data is hosted locally or in the cloud, the facilities which collectively make our digital world possible all have one thing in common: they only exist to serve consumer demand.
If you don't think you are one of these consumers, just ask yourself how often you shop online for goods and groceries from eBay, Tesco or Amazon which, five years ago, you would have gone down the shops to buy. When was the last time you purchased a CD or DVD, as opposed to simply downloading or streaming instantaneously from the comfort of your own home? To back up this question, I did a quick, unscientific survey and found that 95% of my friends and colleagues don't even own a CD or DVD player anymore (the only one who did was my Dad!), which says a lot about the rapidly changing habits of consumers as we race headfirst towards 2020.
As a collective, we should all accept that although some things may well be easier, quicker and cheaper to get online, there is always a reckoning and a price to pay for this convenience. With that in mind, I read that there has again been increased pressure from Greenpeace for data centres to be held to account for their carbon impact and the amount of energy they increasingly consume. Having established that as consumers we are all partly responsible for this data growth, I have to ask why data centres are being singled out. Fifteen years ago, when every office worker had their very own PC under the desk keeping their feet warm, it was virtually impossible to work out how much total ICT energy was being consumed. Over time we have progressively moved our IT to the more energy efficient, shared infrastructure platforms offered by data centres, and I wonder if by doing so we have simply painted a target on our backs, making us an easy collective to blame?
Now, I would be the first to admit that there are still plenty of legacy data centres in operation that could and should be doing far more to reduce energy wastage, a fact backed up by the latest Uptime Institute Annual Data Centre Survey, which now puts the average PUE at 1.67. Of the 1,600 data centres surveyed, I am sure you will find a good mix of villains as well as heroes who are doing all they can to reduce energy wastage in their facilities, so tarring the entire data centre sector with the same brush for something we should all be taking responsibility for does seem a little unjust.
Today, providing data centre/hosting services is big business and the need to deliver a "5 star, always-on" service has never been more important. Ensuring these services are delivered in the most reliable and affordable way possible not only makes good business sense but also reduces utility operating costs, which creates the biggest incentive of all for operators given the amount of energy they consume.
Unless we collectively curb our data use (which seems unlikely), there is no denying that the continued growth in online services will inevitably result in more energy having to be set aside to support this demand. It is therefore the responsibility not only of the data centre sector but also of server manufacturers and software developers to work together to ensure that every kWh of ICT energy is consumed as wisely as possible; this way, both consumers and the planet can be sure they are "getting the best bang for their buck".
As the Trade Association for the data centre sector the DCA continues to work with suppliers, providers, policy makers and the DC community to promote the adoption of energy efficiency best practice around the globe. If you would like to find out how you could benefit from closer collaboration, please contact us.
Thank you for all the contributions by members in this month's journal. Next month the theme is another hot topic, 'The World of Cooling'. Your thoughts and views on this subject would be appreciated and valued; the deadline for related articles and copy is 26 June.
By Bobby Collinson, MD Noveus Energy
During the last 12-18 months, volatility in the energy markets has increased considerably and is set to continue. This, coupled with continual increases and changes to the value and structure of non-energy charges, means that data centre developers and operators need to consider more than just the price when negotiating an energy contract.
In our experience of managing energy for some of the largest data centres, operators and developers, the following are some of the most important considerations when sourcing an energy deal from a supplier:
Fixed or Flexible Contract
The first decision faced by many is whether to opt for a fixed or flexible price contract. There is no right answer to this question, but with increased volatility and the uncertain nature of load growth in data centres, the flexible approach mitigates the risk of buying at the wrong time and offers the greatest freedom to increase load without penalties.
The decision on which type of contract to opt for is material in ensuring the optimum energy price is obtained.
When to Buy
In commodity markets, the general rule that the price is cheaper the closer you buy to delivery does not always hold true for gas and electricity. Having analysed the markets over the last 20 years, we find there is no real trend or time of year that is consistently better than any other.
With that in mind, whether buying through a fixed or flexible contract, a strategy setting out the price point at which you would like to secure your price is a must. The strategy should be dictated by a combination of budget considerations, market conditions, risk appetite and future tenant requirements, and ideally set through a workshop with key stakeholders.
If adopting a flexible contract approach, it is essential that any strategy is dynamic and changes with market conditions rather than being fixed at the outset and remaining in place for the duration of the contract – markets change and so should the strategy.
Credit
Credit can be a major problem for early stage data centres, and too often the supplier's decision on a credit fail is accepted as a fait accompli, with a large deposit usually required. Credit, like price, can be negotiated, and there are several options that can reduce the value of the deposit or eliminate it altogether. It is essential to produce a coherent story around tenants, data centre growth, procurement strategy, debt and financing, and to engage with suppliers openly and early in the process.
Non-Energy Charges
Non-energy charges now make up more than 50% of the energy bill, yet they are still treated as though they are non-negotiable. It is true that if your contract starts on 1 April and runs for 12 months, the scope to negotiate these charges through a full tender is limited, as most charges are published. However, 75% of the market negotiates contracts in October, and each supplier forecasts these charges and takes a different view on them for 12/24/36-month durations. This presents an opportunity to negotiate and fix some or all of these charges, regardless of whether you opt for a fixed or flexible contract. It should be treated as an active part of the risk management strategy in the same way as the commodity price.
Buying Green Energy
Most data centres are keen to buy green/renewable energy. In most cases there should be little or no price premium associated with green energy, as suppliers utilise their existing renewables contracts. There is an opportunity to reduce the cost further by negotiating a reduction on the cashflow element of the renewables premiums, which are paid monthly as part of the non-energy charges even though the supplier only pays them to the government at the end of the year.
In addition, it is worthwhile speaking to specialist suppliers in the renewable space where some energy exposure can be hedged directly with producers or through a private wire directly to a renewables provider if there is one nearby.
Getting the Basics Right
All the advice above assumes that the basics of energy procurement are done well. As a guide, the following, if executed well, should ensure you have a fit-for-purpose energy contract.
1. Forecast load projections as accurately as possible for fixed price contracts, as in most contracts the volume cannot be changed mid-contract and you may incur penalties for over- or under-usage.
2. Ensure all charges are included in a fixed price deal, as some of the smaller suppliers pass through some of the costs. In addition, be aware that a fixed price contract can be amended if there is a material change to regulation.
3. Understand what is most important to you as a business when evaluating a flexible contract, as only three elements of the contract can be evaluated quantitatively. The remainder of the terms are qualitative and will be bespoke to what matters most to the data centre. A scoring matrix, as sketched below, can help to analyse these elements.
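As an illustration of that scoring-matrix idea, here is a minimal, hypothetical sketch. The criteria, weights and supplier scores are assumptions for demonstration only; each data centre would substitute its own priorities and offers.

```python
# A hypothetical scoring-matrix sketch for comparing flexible supply offers.
# Criteria, weights and scores below are illustrative assumptions only.

criteria = {                              # weight of each criterion (sums to 1.0)
    "unit price / fees": 0.30,            # quantitative
    "non-energy charge treatment": 0.15,  # quantitative
    "credit / deposit terms": 0.15,       # quantitative
    "trading flexibility": 0.20,          # qualitative
    "reporting and account management": 0.10,
    "green energy options": 0.10,
}

offers = {                                # score each supplier 1 (poor) to 5 (excellent)
    "Supplier A": [4, 3, 5, 3, 4, 5],
    "Supplier B": [5, 4, 2, 4, 3, 3],
}

weights = list(criteria.values())
for name, scores in offers.items():
    weighted = sum(w * s for w, s in zip(weights, scores))
    print(f"{name}: weighted score {weighted:.2f} out of 5")
```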
There are also a number of considerations outside the purchasing and ongoing management of an energy contract that can help developers and existing operators to save costs; as a summary, these are detailed in the charts below, based on real-life conversations with our data centre clients.
John Booth, MD Carbon 3IT
It's been over ten years since the EU Code of Conduct for Data Centres (Energy Efficiency) was first created, and it has been a fantastic success story. There are now over 370 organisations signed up to participate (adhering to the best practices and reporting energy data) and over 250 endorsers registered in the scheme.
In addition, it has spurred other countries and organisations to develop their own schemes based upon the best practice criteria; it has been a global catalyst for reducing energy consumption in data centres.
All new builds (and let's face it, there are a substantial number of new builds!) will have the EUCOC DNA in their designs and will adopt all of the best practices they can without necessarily signing up to the scheme.
It has stimulated innovation, especially in the fields of free cooling and liquid (immersion) cooling, and many of the best practices have triggered developments, both technical and legislative.
It can also assist with compliance, especially in the UK: with ESOS, with the CRC in the past, and with the replacement scheme that came into force on 1 April 2019, Streamlined Energy and Carbon Reporting (SECR).
The EU Code of Conduct for Data Centres (Energy Efficiency) was originally launched back in 2008 and has been updated every year since. The latest version (v10.0.1) is available on the EUCOC website at https://ec.europa.eu/jrc/en/energy-efficiency/code-conduct/datacentres together with the supporting documentation (I'd advise anybody planning to use it to read all the documents prior to implementation).
The EUCOC has a best practices committee that meets once a year, usually in the autumn, to review sections 10 and 11, which cover potential future best practices and items under consideration: effectively, best practices that may be added the following year and those that still need some work or are subject to other external influences (mostly in energy-efficient software development).
It also reviews the best practice section proper, to update, edit or remove any best practice that is no longer a best practice or has been subject to new information from other parties. For instance, if ASHRAE or CEN/CENELEC/ETSI were to update their documents, then the appropriate best practice content would be updated as well.
You will note that the use of ISO standards was made MANDATORY in 2016 for ISO14001 and ISO50001, being Environmental Management and Energy Management Systems respectively. The reason for this, quite simply, is that BOTH these standards are required under other legislation across the EU: energy management as part of the EU Energy Efficiency Directive (ISO50001 exempts organisations from ESOS), while ISO14001 can exempt organisations from other environmental legislation (too many to list, and country dependent).
Updates for 2019 included a requirement for site documentation (astounding that some organisations don't have current drawings, plans and operation and maintenance manuals) and training (again, the level of training in some facilities leaves a lot to be desired); refer to the EUCOC for guidance.
The committee is drawn from industry experts working for data centre owners and operators, consultants and the supply chain, and is VENDOR independent. The most recent meeting took place in Ispra, near Milan, Italy, in 2018.
A further recent change is that the EUCOC is now a technical report under the EN50600 series of data centre design, build and operate standards; its full title is PD CLC/TR 50600-99-1:201X (X being the year of publication).
Both versions of the document are similar, except that the TR has a different format and separates the “optional” best practices into a separate section.
The TR is published after the publication of the EUCOC and only after approval from the various national standards bodies.
Endorser/Participant?
There are two ways to get involved. The first is as an endorser: this means that you promote the code to other organisations; develop products, services or policies that reflect the EUCOC core principles; or require the use of the EUCOC in your own or a third-party facility, perhaps via tender or procurement processes. There are currently more than 250 endorsers, including IT equipment manufacturers, large hyperscale operators, software manufacturers, consultants and the supply chain; a full list can be downloaded from the EUCOC website.
The other is to implement the EUCOC in your facility and participate. Participation requires the completion of the application spreadsheet, the recording of energy data and, if you are starting your EUCOC journey, a regular annual update of progress against the action plan and energy data.
There have been over 370 applications to become a participant; these are reviewed by the EU-JRC and also published on the website. Participants range from global IT systems integrators to single-site colocation companies and enterprises.
The EUCOC contains some 150+ best practices covering management, IT equipment, cooling, power, other power, design considerations and, finally, monitoring and metrics.
It is without doubt a cultural and strategic change tool, as it makes organisations THINK about how they plan, design, deliver and maintain IT systems within the data centre/server room environment and the wider organisation. After all, every IT system has a user interface (accessing data), transmission systems (routes to the data) and a repository of data (the server room/data centre), all of which require components to be located securely, with adequate power, network and cooling support systems.
Sadly, in my own experience, organisations rarely use the code for this purpose, preferring to adopt those best practices that fit into their own energy efficiency analysis, or perhaps that arise from recommendations under the Energy Savings Opportunity Scheme (ESOS), and thus are mostly of a physical or tangible nature (things you can touch) rather than policies, processes and procedures (things you can't touch and which are more theoretical).
Useful?
The usefulness of the EUCOC really depends on your viewpoint. If we assume that data centre environments are divided into two main types (and I appreciate that there are five, as per the table below), being the enterprise (i.e. a sole organisation has full control over all aspects of the environment) and colocation (essentially a car park for servers, network and compute, where you purchase space, power and cooling), then the usefulness is dependent on the level of control you have.
The EUCOC has a very useful section that splits the “operators” into 5 categories and then assigns the endorser/participant status recommended.
Control | Operator | Colo Provider | Colo Customer | MSP in Colo | MSP |
Physical Building | Implement | Implement | Endorse | Endorse | Implement |
M&E | Implement | Implement | Endorse | Endorse | Implement |
Data Floor/Air Flow | Implement | Implement/Endorse | Implement/Endorse | Implement | Implement |
Cabinets/Air Flow | Implement | Implement/Endorse | Implement/Endorse | Implement | Implement |
IT Equipment | Implement | Endorse | Implement | Implement | Implement |
OS/Virtualisation | Implement | Endorse | Implement | Implement | Implement |
Software | Implement | Endorse | Implement | Implement/Endorse | Implement/Endorse |
Business Practices | Implement | Endorse | Implement | Endorse | Endorse |
MSP = Managed Service Provider
An example of a combined endorse/implement best practice would be the installation of IT equipment into a hot/cold aisle layout data centre; both parties should implement it themselves and endorse it to the other party (or parties).
Another example: a colo provider only needs to implement the best practices that relate to the building, the M&E, the data floor/air flow and the cabinets/air flow, whilst a colo client would use the best practices that relate to the IT equipment, the OS, software and business practices. Each should "endorse" to the other parties those best practices over which they themselves have no control.
The EUCOC is intended to be a guidebook for a journey from a legacy data centre (think PUE 3+, costing a small fortune in energy and probably not providing the level of support required in the 21st century) to a fairly state-of-the-art data centre (PUE sub-1.5, reasonably energy efficient, and flexible and nimble enough to support the business).
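To put a rough number on that 'small fortune', here is a minimal sketch of the energy-cost gap between a PUE 3.0 facility and a PUE 1.5 facility. The 1 MW IT load and the £0.14/kWh tariff are illustrative assumptions only, and PUE is taken in its standard sense of total facility energy divided by IT energy.

```python
# Rough comparison of annual electricity cost at PUE 3.0 versus PUE 1.5.
# The IT load and tariff are assumptions for illustration, not real project figures.

IT_LOAD_KW = 1_000           # assumed constant 1 MW IT load
TARIFF_GBP_PER_KWH = 0.14    # assumed electricity price
HOURS_PER_YEAR = 8_760

def annual_energy_cost(pue: float) -> float:
    """Total facility energy = IT energy x PUE; cost = energy x tariff."""
    total_kwh = IT_LOAD_KW * HOURS_PER_YEAR * pue
    return total_kwh * TARIFF_GBP_PER_KWH

legacy = annual_energy_cost(3.0)    # roughly £3.7m a year
modern = annual_energy_cost(1.5)    # roughly £1.8m a year
print(f"Annual saving from PUE 3.0 to 1.5: £{legacy - modern:,.0f}")
```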
Sadly, as mentioned above, I don't think the guidebook alone is sufficient; you really need a guide. This guide can be internal, probably someone responsible for compliance issues, or an energy manager who has completed the EUCOC/energy efficiency training course (available from most data centre training companies), or alternatively an external resource skilled in the subject.
John Booth chairs the DCA energy efficiency steering group, sits on the BSI TCT 7/3 committee (EN50600), reviews EUCOC applications on behalf of the EU-JRC, and is the Global Lead Assessor/Auditor for the Certified Energy Efficient Data Centres Award (CEEDA). He has assessed some 250+ EUCOC applications and conducted 60+ CEEDA assessments, the most recent being in Lincoln (UK) and Finland. He is also the Technical Director of the National Data Centre Academy and can be contacted at john.booth@carbon3it.com or john.booth@nationaldcacademy.com.
By Julie Loveday, Energy Consultant
The Energy Savings Opportunity Scheme (ESOS) is a mandatory energy assessment used to identify energy savings. Organisations that are captured by ESOS must carry out assessments/audits every four years. These audits measure the amount of energy used by their buildings, industrial processes and transport, and are designed to identify cost-effective energy saving measures which the organisation may wish to implement. The implementation of any identified saving measures is not mandatory under the scheme. In the UK, the scheme is administered by the Environment Agency.
ESOS applies to large UK undertakings and their corporate groups. It mainly affects businesses, but can also apply to not-for-profit bodies and any other non-public sector undertakings that are large enough to meet the qualification criteria.
In terms of the qualifying criteria, a large undertaking is defined as any UK undertaking that either employs 250 or more people, or has an annual turnover in excess of €50 million and an annual balance sheet total in excess of €43 million.
As the measure of turnover in the scheme is defined in euros, it will be necessary, if your accounts are quoted in pounds sterling, to convert them using the Bank of England exchange rate between the euro and pound sterling at close of business on the qualification date, 31 December 2018.
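As a simple illustration of that conversion step, here is a minimal sketch of the financial-threshold test. The exchange rate shown is a placeholder assumption; in practice you would use the actual Bank of England rate for 31 December 2018, and the 250-employee test applies independently of the financial one.

```python
# Hypothetical check of the ESOS financial thresholds for sterling accounts.
# The exchange rate is a placeholder; use the Bank of England EUR/GBP rate
# at close of business on 31 December 2018 for a real assessment.

EUR_PER_GBP = 1.11                        # placeholder assumption only
TURNOVER_THRESHOLD_EUR = 50_000_000
BALANCE_SHEET_THRESHOLD_EUR = 43_000_000

def exceeds_financial_thresholds(turnover_gbp: float, balance_sheet_gbp: float) -> bool:
    """True if both the turnover and balance sheet tests are exceeded in euro terms."""
    return (turnover_gbp * EUR_PER_GBP > TURNOVER_THRESHOLD_EUR
            and balance_sheet_gbp * EUR_PER_GBP > BALANCE_SHEET_THRESHOLD_EUR)

# e.g. £48m turnover and £40m balance sheet would exceed both thresholds at this rate
print(exceeds_financial_thresholds(48_000_000, 40_000_000))   # True
```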
The reference period for the Phase 2 window began in January 2018, and the deadline for the submission of compliance to the EA is 5 December 2019. This must be overseen by a qualified ESOS Lead Assessor, of which there is a finite number.
There were over 6,800 organisations captured in Phase 1, yet only around 650 qualified lead assessors.
Organisations that are captured and do not comply may incur penalties ranging from publication of the organisation's non-compliance on the EA's website to a £50,000 fixed penalty plus £500 per day.
If you have any questions and are captured by ESOS, LG Energy Group (LGE) would be pleased to discuss the most effective route to compliance. Please contact the Partnership Relationship Manager, Julie Loveday, on mobile 07384 469 930, or the Partnership Relationship support team on 0161 641 1943.
LGE has several in-house ESOS Lead Assessors. During ESOS Phase 1, LGE completed ESOS compliance for more than 70 organisations, in sectors ranging from retail, commercial and pharmaceuticals to transport, data centres and manufacturing. This resulted in the audit of more than 400 sites. Three of LGE's clients were randomly selected for EA audits and passed with flying colours.
Please see the full government guidance at https://www.gov.uk/guidance/energy-savings-opportunity-scheme-esos
LG Energy Group contact:
Julie Loveday – Data Centre Alliance Partnership Relationship Manager & Energy Consultant
Telephone: 0738 446 9930 Email: Julie.loveday@lgegroup.com
Ted Pulfer, Enterprise and End User Consultant at Keysource
While the data centre industry continues to go from strength-to-strength, addressing the environmental impact that an estate can have remains one of the biggest talking points for operators. Ted Pulfer, enterprise and end user consultant at Keysource, explains some of the measures businesses should consider to reduce their carbon footprint.
In an age of heightened environmental awareness, especially where energy consumption is concerned, no business sector in the world can afford to rest on its laurels. For those in the data centre industry, where large amounts of power are required to run today’s digital world, it is even more pertinent.
Recent reports suggest that hyperscale data centres, often consisting of larger estates powered by one operator, already account for over two per cent of the world's greenhouse gas emissions. The emergence of 5G technology means this consumption is likely to keep rising, and further research shows that an anticipated increase in internet-connected devices could result in the sector using around a fifth of the world's energy by 2025.
A greener future
Clearly more needs to be done, and the IT sector as a whole has often been accused of putting talk above action when it comes to addressing environmental issues. But options exist, whether simple or radical, that can make a real difference.
The EU's best practice guidelines for data centre energy efficiency, which are compiled by a broad group of vendors, consultants and professional bodies, are a starting point. They outline the areas of responsibility operators should bear in mind for their estates and the general policies that apply. While useful, however, they don't need to be the only route businesses consider.
Renewable energy can also play a big part in future plans, and many large tech firms have already started mapping out their targets by using it. Take Facebook, which has committed to using 100 per cent renewable energy in its data centre estates by 2020.
Google, on the other hand, has used natural resources to power its 86,000 sq. ft Hamina data centre in Finland, utilising 72 megawatts of wind-farm capacity and drawing on the surrounding sea water for its entire cooling provision, making the facility almost carbon-zero.
While most mid-market operators will struggle to afford to make such drastic changes, they can make a start by installing small-scale measures for self-generated renewable energy from wind farms or solar as part of the data centre estate. These alone can reduce CO2 emissions by up to 30 per cent compared to conventional generation.
Perhaps the most exciting example of sustainability within the industry can be found in Sweden, however. The 'Boden Type One' data centre uses machine learning software to reduce server energy consumption by 20 per cent, runs on renewable energy, uses fresh air for its cooling system and has been built with an 'eco-friendly' timber design in mind.
Its aim is to become the most energy- and cost-efficient data centre in the world. Learnings from its first year in operation will likely prove hugely useful for the sector when they’re released in March 2020.
The business of sustainability
Some might see it as burdensome and costly, but sustainability actually makes for an appealing business case, both in terms of return on investment and total cost of ownership. Options do exist, and at a time of heightened awareness, those in the industry need to be doing all they can to make their estates greener.