Category Archives: BLOGS

World War on the COVID-19 pandemic

We are living in unprecedented times in 2020. Just as we ushered in a new decade with optimism, the world was gripped by a virus spreading across the entire globe. This was just another crisis for me, having already seen the mayhem caused by the dot-com bust, the 9/11 attacks, the SARS epidemic, the 2008 financial crisis and the H1N1 swine flu over the course of my career. What was alarming, however, was the rapid, distributed nature of the coronavirus contagion, which engulfed nearly every country in the world after its first epicenter in the city of Wuhan, China. On March 11th of this year, the World Health Organization declared it a pandemic.

However, this crisis was not completely unanticipated. In fact, some of the world's best-known thought leaders, such as Bill Gates, had been warning for years that the next big crisis would be caused by an epidemic rather than a nuclear war.

"The next outbreak? We're not ready", Bill Gates, TED Talk 2015 (Source: YouTube)

Unfortunately, the warnings were not taken seriously by the authorities and world leaders. In fact, regressive measures were taken, such as the disbanding of the office responsible for pandemic preparedness and drastic cuts to the budgets of agencies like the CDC. Even the World Health Organization has seen funding cuts from donor nations and a ramp-down in budget allocations. In an era of buzzwords like "digital transformation" and "data is the new oil", it is astonishing that decisions as important as healthcare budget cuts are not data driven. The billions cut from those budgets are now going to cost the global economy trillions of dollars.

The pandemic has clearly shown that our healthcare systems were not designed or architected for this kind of distributed crisis. It has also broken many myths and biases about the robustness of healthcare systems in developed nations. The swiftness of the spread in this hyper-connected, globalized world took us by surprise.

As the world gets ready to fight this new world war against an unseen enemy, we need to win as many battles as possible. It is important to convert the weaknesses in our current systems into strengths. Technology and data can be leveraged to make a difference in the current situation.

Here are some important ways technology can help the healthcare system at this moment:

Scalability of Testing:

We have all heard epidemiology experts repeat the secret to fighting this pandemic: tests, tests and more tests! However, the availability of testing centers and the number of tests have to scale exponentially to make this possible. A traditional, centralized model of test data input, processing and result output can only scale linearly. It is limited by design.

This is where distributed solutions can help organizations achieve elastic, exponential scaling of their testing capabilities and quicker turnaround times for test results. Distributed computing that combines the capabilities of the edge, the cloud and datacenters over high-speed connectivity can make this elastic scaling of testing possible.

This provides scalability in a pandemic scenario by allowing smaller clinical pathology labs to collaborate with larger central laboratories. By interfacing laboratory equipment with a distributed cloud network, pathologists can finalize reports from any remote location globally, as sketched below. It also lets patients access their reports online without exposing themselves by visiting the lab.
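To make the collaboration concrete, here is a minimal sketch, assuming a hypothetical central-lab API, of how a satellite lab might push a finalized test result to a shared cloud service for remote review; the endpoint URL and payload fields are illustrative, not part of any real system.

```python
# A minimal sketch (illustrative only): a satellite lab submits a finalized
# test result to a hypothetical central-lab endpoint for remote review.
import json
from urllib import request

def submit_result(result: dict, endpoint: str) -> int:
    """POST a result record as JSON and return the HTTP status code."""
    req = request.Request(
        endpoint,
        data=json.dumps(result).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:  # wrap in retries/auth in a real system
        return resp.status

# Example payload a small lab might send (endpoint is hypothetical):
# submit_result(
#     {"sample_id": "LAB-0001", "assay": "rt-pcr", "status": "awaiting-review"},
#     "https://central-lab.example/api/results",
# )
```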

Augmentation of Testing:

Some of the countries at the forefront of the war against the virus realized early on that testing methods and systems would need to be augmented. One such country, South Korea, used artificial intelligence to augment regular testing methodologies. Terenz Corp, a South Korean company, works on AI-based decision support systems for critical healthcare diseases, aiming to improve quality of life by detecting high-risk diseases at an early stage and monitoring them with the Terenz platform. To detect COVID-19 in minutes, Terenz developed an AI-based screening system that can detect COVID-19 from chest X-rays within a few seconds, with a reported accuracy of 98.14%.

To detect COVID-19 and classify images into Normal, Pneumonia and COVID-19, an AI engine leveraging a convolutional neural network (CNN) was developed. The CNN performed well, with an overall reported accuracy of 98.14%. Read more about the solution from Terenz here.
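For readers curious what such a classifier looks like in code, the following is a minimal sketch of a three-class chest X-ray CNN in Keras. It is not Terenz's actual model; the architecture, image size and class set are assumptions for illustration only.

```python
# A minimal sketch of a 3-class chest X-ray classifier (Normal / Pneumonia /
# COVID-19) using a small CNN in Keras. This is NOT Terenz's model; the
# architecture and image size are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 3
IMG_SIZE = (224, 224)

def build_model() -> tf.keras.Model:
    model = models.Sequential([
        layers.Input(shape=IMG_SIZE + (1,)),   # grayscale X-ray input
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Usage, assuming tf.data.Dataset objects of (image, label) pairs:
# model = build_model()
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```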

Extensive use of Data Analytics:

Taiwan leveraged its national health insurance database and integrated it with its immigration and customs databases to build a big data foundation for analytics; the combined data generated real-time alerts during clinical visits, based on travel history and symptoms, to aid case identification. Even though the island is only about 81 miles off the coast of mainland China, it was not hit as hard as other nations in the region, because decisions were made using data systems architected after the SARS epidemic. By merging these databases, the authorities could see every citizen's 14-day travel history and ask those who had visited high-risk areas to self-isolate; a simplified version of such a merge is sketched below.
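As a simplified illustration of that kind of merge (with synthetic data, not Taiwan's actual schema), the following pandas sketch joins clinic visits with travel history and flags visits by patients who returned from a high-risk area within the previous 14 days.

```python
# A minimal sketch (synthetic data) of joining health-insurance visit records
# with travel history so a clinic visit can raise an alert when the patient
# recently returned from a high-risk area.
import pandas as pd

visits = pd.DataFrame({
    "citizen_id": ["A1", "B2"],
    "visit_date": pd.to_datetime(["2020-02-10", "2020-02-11"]),
    "symptoms": ["fever, cough", "none"],
})
travel = pd.DataFrame({
    "citizen_id": ["A1", "C3"],
    "arrival_date": pd.to_datetime(["2020-02-01", "2020-01-20"]),
    "origin": ["high-risk-area", "low-risk-area"],
})

merged = visits.merge(travel, on="citizen_id", how="left")
merged["alert"] = (
    (merged["origin"] == "high-risk-area")
    & ((merged["visit_date"] - merged["arrival_date"]).dt.days <= 14)
)
print(merged[["citizen_id", "symptoms", "alert"]])
```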

Looking back at historical pandemic data can assist with many important decisions as well as with predictive analytics. Because of its high infection rate, the current contagion is often compared with the 1918 Spanish Flu, which caused an estimated 39 million deaths, wiping out about 2% of the global population. Important lessons can be learnt from the 1918 pandemic in terms of macroeconomics, the progression of the virus and mortality numbers.

Although the Spanish Flu is often said to have started in China before spreading to Europe, North America and South Asia, the highest mortality rates were seen in British India, which lost a cumulative 5.2% of its population during the pandemic, by far the highest rate. China's death rate was not nearly as high, but because of its large population it contributed significantly to the global death toll. The US had a cumulative death rate of 0.5%, corresponding to roughly 550,000 deaths.

Deep neural networks can be trained on datasets from the Spanish Flu, SARS and COVID-19 to build predictive models for better decision making, as in the simple curve-fitting sketch below. This actionable intelligence can make a difference to nations like India, which suffered the most in a similar novel viral outbreak a century ago.
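As one very simple example of predictive modelling on outbreak data, the sketch below fits a logistic growth curve to synthetic cumulative case counts with SciPy. Real models would be far richer, but the workflow of fitting historical curves and projecting forward is the same.

```python
# A minimal sketch (synthetic data) of fitting a logistic growth curve to
# cumulative case counts -- one simple form of predictive modelling on
# historical outbreak data.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic growth: K = final size, r = growth rate, t0 = inflection day."""
    return K / (1.0 + np.exp(-r * (t - t0)))

days = np.arange(30)
cases = logistic(days, K=10_000, r=0.35, t0=15) + np.random.normal(0, 100, 30)

params, _ = curve_fit(logistic, days, cases, p0=[cases.max() * 2, 0.3, 10])
K_hat, r_hat, t0_hat = params
print(f"Projected final size ~{K_hat:.0f}, peak growth around day {t0_hat:.0f}")
```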

Secure Data Exchange:

This war will require coordinated sharing of data between various entities: agencies, states, countries, global bodies and the private sector. One of the biggest hurdles in the healthcare industry is data sharing, since it is highly regulated by compliance regimes such as HIPAA, the HITECH Act, MACRA, GDPR and chain-of-custody requirements. The data will be needed for rapid test development, vaccine development, tracking herd immunity, clinical trials and critical decision making. Data lifecycle processes can be automated via Robotic Process Automation (RPA). The data can also power applications such as the contact tracing app developed by GovTech in Singapore. Without coordination, this can become a labyrinth of apps and data; the consistency of application workloads and support for the various data and file formats will be critical for stable input and output systems.

It is also necessary that these data systems are highly available, accessible 24x7 across distributed research locations. Disruption in the availability of data systems and failure to comply with service level agreements (SLAs) can seriously hamper the progress of rapid medical research and innovation. Technologies like Big Data Clusters from Microsoft Corp, or equivalent open source systems, can form the core building blocks of such a healthcare data system.

It is equally important to ensure that the data exchange systems are secure, with encryption at multiple levels and multi-layer security enabled; a basic encryption sketch is shown below. For remote workers, VDI (virtual desktop infrastructure) could be one mechanism for accessing the data systems. This would allow organizations like the WHO and similar agencies to act as secure data gatekeepers.
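As a small illustration of encryption in the exchange path, the sketch below encrypts a record with symmetric encryption from Python's cryptography package before transmission. Key management in a real deployment would sit in a key vault or HSM; the record fields are hypothetical.

```python
# A minimal sketch (illustrative only) of encrypting a record before it is
# exchanged between agencies, using symmetric encryption (Fernet) from the
# `cryptography` package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, stored in a key vault / HSM
cipher = Fernet(key)

record = b'{"patient_id": "anon-123", "result": "negative"}'
token = cipher.encrypt(record)       # ciphertext safe to transmit
print(cipher.decrypt(token))         # receiving party decrypts with the shared key
```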

Optimization of Supply Chain Management:

Healthcare staff and hospitals treating COVID-19 patients are facing severe shortages of personal protective equipment and ventilators. Healthcare systems were simply not planned for a pandemic of this proportion.

OEMs and other companies are rushing to make sure their assembly lines run at full throttle. However, complex medical equipment consists of multiple components from numerous vendor sources. Industry 4.0 aims to provide an integrated supply chain, from suppliers all the way to end healthcare consumers. This is critical for delivering vaccines to the nearly 8 billion people on the planet, so that they can be safely transported to its most distant corners as early as possible using cold-chain storage.

With effective implementation of automation alongside 3D printing, IoT, blockchain and AI, an Industry 4.0 supply chain can hit its key performance indicators and roll out essential medical equipment and gear faster than ever before. Real-time planning gives more flexibility in this situation, and accurate performance management ensures that information flows from high-level KPIs down to granular process endpoints, such as the real-time location of a component. Data from these sources can also help government trade departments and agencies ensure that components are procured so that local production can be set up near virus epicenters. Blockchain can be implemented to maintain the veracity of supply chain data; the idea is sketched below. Startups like PencilData focus on maintaining data veracity while protecting supply chain data from invisible cyber attacks.
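The blockchain idea can be illustrated in a few lines of code: the sketch below (not PencilData's product) hash-chains supply-chain events so that tampering with an earlier record invalidates every later hash.

```python
# A minimal sketch (illustrative, not PencilData's product) of hash-chaining
# supply-chain events so that tampering with an earlier record invalidates
# every later hash -- the basic idea behind blockchain-backed data veracity.
import hashlib, json, time

def add_block(chain, event):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"event": event, "timestamp": time.time(), "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    chain.append(block)

def verify(chain):
    """Recompute every hash; any tampered block breaks the chain."""
    prev = "0" * 64
    for block in chain:
        body = {k: block[k] for k in ("event", "timestamp", "prev_hash")}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev or block["hash"] != digest:
            return False
        prev = block["hash"]
    return True

ledger = []
add_block(ledger, {"component": "ventilator-valve", "location": "factory-A"})
add_block(ledger, {"component": "ventilator-valve", "location": "assembly-B"})
print(verify(ledger))   # True; edit any earlier event and this becomes False
```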

This could be a long war, but a coordinated global response with effective use of technology systems could tilt it in our favour. We certainly have the smart technology systems of the 21st century as weapons in our arsenal to gain victory over this deadly, invisible enemy!

DISCLAIMER: The information in this document is not a commitment, promise or legal obligation to deliver any material, code or functionality. This document is provided without a warranty of any kind, either express or implied, including but not limited to, the implied warranties of merchantability, fitness for a particular purpose, or non-infringement. This document is for informational purposes and may not be incorporated into a contract. Drootoo assumes no responsibility for errors or omissions in this document. Contact us at [email protected] for more information on the topic.

(The blog will be updated regularly during this challenging period.)

About organizations mentioned in the article:

Drootoo is a Singapore-headquartered company specializing in distributed cloud systems.

McKinsey is a global consulting company.

Pencil Data is a Silicon Valley based company specializing in data security.

Terenz Corp is a South Korean company specializing in AI-based decision support systems for critical healthcare diseases.

Microsoft Corp (MSFT) is a global leader in software, data and cloud systems.

World Health Organization (WHO) is the leading global health body.

The Centers for Disease Control and Prevention (CDC) is the United States government agency in charge of epidemic response.

Taiwan CDC is the agency in charge of epidemic response for Taiwan.

GovTech is the agency in charge of government technology for the Republic of Singapore.

High availability federated authentication for Office 365 in Azure? Drootoo can help.

From your current on-premises IT infrastructure, you have decided to take baby steps toward the cloud to take advantage of the benefits it affords: favourable expenditure treatment, optimal use of resources, lower cost of operations, operational flexibility and more. Email management with Exchange or other mail servers, and their integration with the existing Active Directory, productivity applications, messaging and other communication applications, has always needed more resources than felt necessary. Many organizations have therefore made the leap to the cloud with Office 365 or G Suite. That raises the question of how such domain users can be enabled to access Office in the cloud.
From a web application perspective, users need to be authenticated in order to access their data. When it is an enterprise web application, integration with the in-house identity management solution is called for; in Windows environments, this is Active Directory. For Office 365, "choosing if identity management is configured between your on-premises organization and Office 365 is an early decision that is one of the foundations of your cloud infrastructure". Please note that once the choice is made, reverting to another option takes a lot of work. The various options, including the scenarios they are suitable for, are documented at https://docs.microsoft.com/en-us/office365/enterprise/about-office-365-identity
Unless this is a trial of Office 365, or there is no Active Directory, or there is a very complex on-premises Active Directory that one does not want to work with, the typical choice for large enterprises is to integrate Office 365 using federated authentication. For a more detailed decision tree, please review the document at https://docs.microsoft.com/en-us/azure/security/azure-ad-choose-authn

 

It is always good practice to test the desired implementation and see for yourself the effort involved, whether it works in your environment, and how it all comes together. https://docs.microsoft.com/en-us/office365/enterprise/federated-identity-for-your-office-365-dev-test-environment has the steps to create the required test environment, along with the configuration for the participating servers and the O365 portal settings. Once testing is completed successfully, deployment options can be considered based on the usage of O365 services. In organizations with heavy usage of productivity and communication applications, ensuring high availability is a given. https://docs.microsoft.com/en-us/office365/enterprise/deploy-high-availability-federated-authentication-for-office-365-in-azure has the steps to deploy highly available federated authentication for Office 365 in Azure.

 

The steps involve virtual machines in a single cross-premises Azure virtual network (VNet). Further, highly available cross-premises and VNet-to-VNet connectivity needs to be established. One would expect the VPN gateway to handle this, but note: "Every Azure VPN gateway consists of two instances in an active-standby configuration. For any planned maintenance or unplanned disruption that happens to the active instance, the standby instance would take over (failover) automatically, and resume the S2S VPN or VNet-to-VNet connections. The switch over will cause a brief interruption" (https://docs.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-highlyavailable).

 

To establish dual redundancy, with active-active VPN gateways for both the Azure and on-premises networks, you create and set up the Azure VPN gateway in an active-active configuration, then create (at least) two local network gateways and two connections for your (at least) two on-premises VPN devices. The result is full mesh connectivity of (at least) 4 IPsec tunnels between your Azure virtual network and your on-premises network. The same active-active configuration can also be applied to Azure VNet-to-VNet connections by creating active-active VPN gateways for both virtual networks and connecting them together, forming the same full mesh of (at least) 4 tunnels between the two VNets.
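The tunnel count is easy to see programmatically: with two active-active Azure gateway instances and two on-premises VPN devices, a full mesh pairs each with each, giving four IPsec tunnels, as the small sketch below enumerates (the names are illustrative).

```python
# A minimal sketch (illustrative names) that enumerates the full-mesh IPsec
# tunnels described above: two active-active Azure gateway instances paired
# with two on-premises VPN devices yields four tunnels.
from itertools import product

azure_gateways = ["azure-gw-instance-1", "azure-gw-instance-2"]
onprem_devices = ["onprem-vpn-1", "onprem-vpn-2"]

tunnels = list(product(azure_gateways, onprem_devices))
for a, b in tunnels:
    print(f"IPsec tunnel: {a} <-> {b}")
print(f"Total tunnels: {len(tunnels)}")   # 4
```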

 

How Drootoo makes this a snap!

With Drootoo core services, Provision Cloud Resources is a single, simplified section from which the required cloud resources on Azure can be created for integrating on-premises identity with O365 for user authentication and identity management. Compute options can be used to create the desired VM instances in any of the regions exposed by the cloud provider. Our Network options enable creation of the Virtual Network, Gateway and VPN connections required to complete the task.

One might also wonder whether this collection of configured options can be captured in a cloud resource template, like an AWS CloudFormation template. Our new innovation, Drootoo Blueprint, is a provider-agnostic way to provision a collection of resources on the cloud; a hypothetical example is sketched below. In this case, a single Drootoo Blueprint can be created with the required resources by an organization's Active Directory and network experts and reviewed by the technology management chain of command. Once the desired configuration is approved, the Blueprint can be launched to provision the collection of resources on a single cloud service provider, or across multiple providers. The Blueprint is then available for reuse, with options for version control.
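To give a feel for the idea, here is a purely hypothetical sketch of what such a provider-agnostic blueprint might look like as a simple data structure; Drootoo's actual Blueprint format is not shown here, so every field name below is illustrative.

```python
# A hypothetical sketch of a provider-agnostic blueprint for the federated
# authentication deployment described above. Field names, sizes and counts
# are illustrative only, not Drootoo's real schema.
blueprint = {
    "name": "o365-federated-auth",
    "version": "1.0.0",
    "resources": [
        {"type": "virtual_network", "name": "identity-vnet",
         "address_space": "10.10.0.0/16"},
        {"type": "vpn_gateway", "name": "identity-gw", "mode": "active-active"},
        {"type": "vm", "name": "adfs", "size": "medium", "count": 2},   # AD FS farm nodes
        {"type": "vm", "name": "wap", "size": "medium", "count": 2},    # web application proxies
    ],
    "targets": ["azure"],   # could also list multiple providers
}
```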

Our future vision is to enable solution providers, system integrators and other organizations to create, share and reuse Drootoo Blueprints, thereby enabling organizations with limited technical resources to simply select and deploy the required cloud resource solutions for their businesses.

 

How much does it cost to run your own hybrid cloud?

Hybrid cloud is a concept that evolved around the turn of the last decade, when companies had apprehensions about adopting and relying solely on public cloud services like AWS.

Companies' major concerns about the public cloud were primarily the following:

Security

Vendor Lock-in Concern

Data Location

Regulatory Compliance

Connectivity

By 2017, nearly 70% of cloud users were using hybrid cloud. A hybrid cloud consists of a private cloud run on premises, with some workloads run on the public cloud. Running a cloud on premises requires capital expenditure (capex) to set up the cloud infrastructure, which becomes part of the total cost of ownership (TCO) incurred by the business to run its IT operations. The workloads that run on the public cloud, by contrast, are billed on a rental subscription basis and count as operational expenditure (opex). A hybrid cloud setup therefore incurs both capex and opex to run its operations.

In this blog we will discuss a sample breakup of the minimum cost required to run a datacenter with an 8U standard rack. Many of these costs are hidden in nature and need to be carefully understood before any implementation, otherwise the value of the cloud will be negated.

Hardware & Software Costs

 

Rack

Racks are required to host the hardware running a private cloud setup.

Server

Servers are the computing power that is needed to serve a cloud offering.

Networking

Networking switches and routers are core components for traffic and data movement.

Storage

Storage devices like NAS or SAN are used for data storage to run a private cloud offering.

Host/ Guest Operating System License

Operating system licenses need to be factored in to run the host system and the guest virtual machines in a cloud offering.

Virtualization & Cloud Stack Software Licenses

Private cloud solutions cannot be built without a virtualization or hypervisor layer such as VMware, Hyper-V or KVM. This is a core component of today's cloud technology. On top of it you need to run a cloud stack, which could be open source like OpenStack or a licensed version from vendors like Citrix or Microsoft.

Rental Space Costs

Rental space cost is generally not accounted for, but if your datacenter facility is in a prime business location it adds a significant monthly recurring cost to the cloud infrastructure.

Cooling Costs

To run cloud systems smoothly and avoid overheating of hardware, efficient cooling systems are required.

Power & Backup Costs

To run any datacenter and avoid disruption of cloud services, it is necessary to ensure non-stop 24x7 power supply. To avoid power disruption from the grid, backup UPS systems and generators need to be in place. Without significant investment here, there is a high chance of catastrophic failure of the business applications being served from the cloud, which would also affect any service level agreement (SLA) for the cloud offering.

Security Costs

Security is a prime component of any cloud or datacenter offering. For a private or hybrid cloud, businesses need to factor in the cost of multi-layer security deployment, from basic physical access security through online security, including firewall devices, DDoS mitigation equipment, and network and endpoint security to prevent any breaches of the cloud datacenter.

Labor Costs

To run a private or hybrid cloud offering, highly qualified cloud architects and system administrators are required. These are among the most highly paid hourly or monthly roles in the IT industry today.

Maintenance Costs

The complexity of running a private or hybrid cloud setup increases maintenance costs, which are not always factored in during the cloud setup process. The hidden cost is significantly higher when using open source solutions, as a very competent technical team is required to maintain them and no vendor escalation point is available.

Below is an illustration of the breakup of costs incurred while building a bare-minimum hybrid cloud:

Type of Costs | Monthly Cost | Annual Cost
Minimum Basic Costs (4-6U Server, Networking, Storage, Rack & Software) | $7,000 | $84,000
Rental Space Costs (250-500 square feet) | $833 | $9,996
Cooling Cost | $1,250 | $15,000
Power & Backup Cost (Power Bill, UPS & Generator) | $1,000 | $12,000
Fire Suppression Cost | $216 | $2,592
Security Cost | $119 | $1,428
Soft Cost (Labor & Maintenance) | $3,000 | $36,000
Total Cost | $13,418 | $161,016
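As a quick sanity check, the small sketch below reproduces the annual total in the table above by summing the monthly line items; it can serve as a starting point for plugging in your own numbers.

```python
# Reproduces the annual TCO figure in the table above from the monthly line items.
monthly_costs = {
    "Minimum basic (server, networking, storage, rack, software)": 7_000,
    "Rental space (250-500 sq ft)": 833,
    "Cooling": 1_250,
    "Power & backup (power bill, UPS, generator)": 1_000,
    "Fire suppression": 216,
    "Security": 119,
    "Soft costs (labor & maintenance)": 3_000,
}

monthly_total = sum(monthly_costs.values())
annual_total = monthly_total * 12
print(f"Monthly: ${monthly_total:,}  Annual: ${annual_total:,}")
# Monthly: $13,418  Annual: $161,016
```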

Keeping the above cost factors in mind while building your own hybrid cloud will be beneficial to businesses. Drootoo can help businesses save up to 70% of the TCO of running a hybrid cloud while still addressing all the concerns about using public cloud services. Please write to us at [email protected] for more information and discussion.

Do you know your cloud?

[Image: Gartner 2017 State of Cloud]

Do you know your right #Cloud? #Gartner has recently published a Magic Quadrant for Cloud Infrastructure as a Service (#IaaS), highlighting the various strengths of the public cloud providers.

#AWS: World market leader and a common choice for strategic adoption, appealing to customers that desire the broadest range of capabilities and long-term market leadership.

#MicrosoftAzure: Appeals to customers employing a multi-cloud strategy and are committed to #Microsoft technologies.

#GoogleCloud: Positioned as an “open” provider, emphasizing #portability as its key value proposition.

#AlibabaCloud: Market share leader in China, with an impressive ecosystem of managed service providers and #ISVs.

Source: Magic Quadrant for Cloud Infrastructure as a Service, Worldwide.

With the #Drootoo unified cloud platform, we enable a network across different hyperscale cloud providers. The single interface and seamless integration of these cloud providers by Drootoo will allow you to break out of non-networked silos and enjoy a host of possibilities for your business with a scale-out infrastructure architecture.

[Image: Drootoo architecture]

Reach out to us if you would like to find out more! https://drootoo.com/contact.html or mail us at [email protected]

Edge to the Cloud – A data odyssey!

Let us imagine the year 2021. We see not just flying planes but flying people and flying drone traffic. And yes, they fly on their own, very smartly, without colliding with each other or dropping out of the sky. Needless to say, the roads are packed with smart cars of all sizes. Interesting, isn't it? But this is not science fiction; it is going to be a daily fact of life.

[Image: Flying city]

How will this happen? This will be a completely AI-driven planet, with machines that do not require human intervention to perform their mundane operations. They will communicate with each other and make smart decisions on their own. These smart machines will have built-in collision avoidance and image recognition systems, running convolutional neural network and long short-term memory models in their embedded systems. There will be real-time capture, processing and actionable, output-driven activity, all in one smart decision support system.

However, these cognitive flying objects will require huge amounts of real-time data processing. To handle these large data volumes, the systems will have embedded GPUs. The challenge is that this kind of real-time processing of AI algorithms will not be possible in the cloud, because of the wireless latency of transferring data even with 5G systems deployed everywhere. So most real-time AI operations will have to take place in the embedded systems of the flying machines. Once the machines are back at their docking stations, the flight data will be archived to the cloud. The cloud will also be instrumental in controlling fleets of these smart vehicles and storing data for historical analysis, and it will be the central brain for large decision support systems.

This real-time data processing at the source edge, combined with storage in the cloud, is what we call edge-to-cloud (E2C) networking; a simple sketch of the pattern is shown below. Architecting and developing systems that support optimized E2C operations will be key, and frameworks have to be defined. These are going to be interesting times for distributed system architects.
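The sketch below illustrates the E2C pattern described above: inference runs locally in real time, telemetry is buffered on the device, and the buffer is archived to the cloud only when the vehicle is docked. The function names and buffer size are hypothetical placeholders.

```python
# A minimal sketch (illustrative only) of the edge-to-cloud (E2C) pattern:
# real-time inference at the edge, deferred archival to the cloud on docking.
import json, time
from collections import deque

telemetry_buffer: deque = deque(maxlen=100_000)   # bounded on-device buffer

def run_local_model(frame):
    # Placeholder for on-device CNN/LSTM inference (e.g. on an embedded GPU).
    return {"obstacle_detected": False, "confidence": 0.99}

def process_frame(frame):
    decision = run_local_model(frame)             # real time, no network round trip
    telemetry_buffer.append({"ts": time.time(), **decision})
    return decision

def on_docked(upload_to_cloud):
    # Deferred archival: flush buffered telemetry once high-bandwidth
    # connectivity is available at the docking station.
    payload = json.dumps(list(telemetry_buffer))
    upload_to_cloud(payload)
    telemetry_buffer.clear()
```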

[Figure 1: High-level E2C architecture]

How is the cloud evolving?

Since the 1940s the world has been transformed by successive generations of computer technology, with massive paradigm shifts in how computing is performed: from large centralized mainframes, to the PC and internet era that ushered in the democratization of computing power, to the current mobile and cloud era.

Distributed computing has been used for some time, and web hosting has long been a popular way to serve information on the internet through web applications in browsers. It evolved through peer-to-peer, network, client-server and grid computing. In the mid-2000s, cloud computing took shape with platforms such as NASA's Nebula and Amazon AWS. Cloud computing is commonly divided into three deployment models: public, private and hybrid. Public cloud is where services are rendered over a network that is open for public use, giving the option to consume compute, storage, platform, networking and other resources from a remote location.

Since 2006, when the term "cloud" came into common use with the introduction of Elastic Compute Cloud (EC2) by Amazon, there has been a paradigm shift in the way IT is consumed. It helped businesses move from a traditional IT capex model to an opex subscription model. However, the public cloud was not adopted as quickly as expected by the 2010s. The main concerns for businesses were security, compliance, governance, knowledge competency and vendor lock-in.

To mitigate the risks of the public cloud, private and hybrid cloud strategies evolved. This again required businesses to build and maintain on-premises datacenters to host private cloud or backup environments. The mix of public, private and hybrid cloud strategies increased the cost of IT, as a blend of capex and opex was always going to be expensive and hard to control.

By 2015, however, public cloud technology had evolved to be far more robust, with providers securing their systems and addressing the challenges of public cloud consumption. The market also saw more strong players entering the scene, such as Microsoft, Google, IBM, Rackspace, Red Hat and Oracle. This gives rise to the next paradigm shift in cloud computing, allowing businesses to adopt a true public cloud strategy: multi-cloud computing.

Multi-cloud computing lets businesses run IT operations that are 100% free of datacenter hardware. You can run production with one public cloud vendor and your test/dev environment with another, and use the public cloud itself as your high availability and disaster recovery strategy rather than maintaining an on-premises setup; a simple placement sketch is shown below. A distributed environment can also be used to expand the business into geographies where one cloud provider is not present. Even supercomputing power can be leveraged through a multi-cloud model to gain high performance at optimized cost for projects like genome sequencing, simulating the Big Bang, understanding earthquakes, mapping the bloodstream, predicting climate change or even testing weapon systems.
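As a toy illustration of such a strategy, the sketch below maps workloads to providers and regions; every provider and region name is illustrative, not a recommendation.

```python
# A hypothetical multi-cloud placement map: production on one provider,
# test/dev on another, disaster recovery on a third. Names are illustrative.
placement = {
    "production":        {"provider": "aws",   "region": "ap-southeast-1"},
    "test_dev":          {"provider": "azure", "region": "southeastasia"},
    "disaster_recovery": {"provider": "gcp",   "region": "asia-southeast1"},
}

def provider_for(workload: str) -> str:
    return placement[workload]["provider"]

print(provider_for("production"))   # aws
```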

At Drootoo we provide a seamless unified cloud platform that gives businesses a smart and secure way of performing multi-cloud computing. This is the true democratization of the cloud computing era!