The Essential Guide to Digital Transformation
Chapter 6
The Beginner's Guide to Emerging Technologies
Date: 7 April, 2024
Fifth Generation (5G) cellular
First, let's put 5G into perspective.
2G phones let us send SMS text messages, 3G let us upload pictures, and 4G let us watch video. So what’s the big deal with 5G? There isn’t one single feature that defines 5G, because there are many. Some applications have been around for a while, just waiting for the network technology to catch up before they can become as mainstream as instant messaging, digital photography, and streamed movies. 5G is that network technology.
A few examples: virtual and augmented reality video; high-speed gaming without a console; remotely driving vehicles on public roads; surgeons in a city hospital performing operations with robots in rural clinics; intelligent video cameras improving the policing of street crime. 5G is all about speed. It brings massive increases in the speed of your Internet connection and vastly reduces the response time of whatever you are doing. We call these characteristics “bandwidth” and “latency”: bandwidth goes up, latency goes down.
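To get a feel for what those two numbers mean in practice, here is a minimal Python sketch comparing a download over 4G and 5G. The speeds and latencies used are illustrative assumptions, not measured figures for any real network.

```python
# Rough, illustrative comparison of 4G vs 5G download time and response time.
# The bandwidth and latency figures are assumptions for the example, not measurements.

FILE_SIZE_GB = 2  # roughly a feature-length HD movie

networks = {
    "4G": {"bandwidth_mbps": 50,   "latency_ms": 50},
    "5G": {"bandwidth_mbps": 1000, "latency_ms": 10},
}

for name, n in networks.items():
    file_size_megabits = FILE_SIZE_GB * 8 * 1000              # GB -> megabits
    download_s = file_size_megabits / n["bandwidth_mbps"]     # time to pull the file down
    print(f"{name}: ~{download_s:.0f} s to download {FILE_SIZE_GB} GB, "
          f"~{n['latency_ms']} ms before anything starts to happen")
```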
5G isn’t an upgrade of one single technology. It is about connecting faster. We can break it down into four main components, and all four of these need to be upgraded. These components are: (1) your phone, (2) the antenna it talks to, (3) the fibre-optic cable in the ground, and (4) the core network that manages everything.
1. Your Phone
So, let’s start with the first component: your phone. Yes, you will need a new phone. That phone must contain a new radio chip that communicates 5G signals over 5G radio frequencies. But other than that, there will likely be no difference from your existing smartphone. Same screen, same apps. It may feel a bit quicker when you swipe between apps and websites because the response time over the 5G network is improved, but you’re not going to notice an immediate difference to your mobile phone service compared to your 4G phone. Look at it like this: you can already watch live video on your 4G phone in great HD quality today, and your eye would not notice any increase in picture quality on that small screen if it were sent as Ultra-HD 4K video over 5G.
But over time, you will see new virtual and augmented reality apps that take advantage of the 5G network. For example, you could be sitting in Wembley stadium watching a football match and using the Sports Channel AR app on your 5G phone to see player details superimposed over the live broadcast. Or you could even insert yourself into the football or basketball action on screen to make a cute video for Instagram.
2. Antenna
The second component is the 5G antenna that your new 5G phone connects to. We call this the Radio Access Network (RAN) and it uses radio waves to talk to your phone. These 5G antennas are quite different to the mobile antennas that you see on top of buildings today. The 5G antenna will be a rectangle about 70 cm by 40 cm. Inside is a grid of tiny antennas that we call “Massive MIMO”. There will be either 32 or 64 transmitters and 32 or 64 receivers inside that single antenna unit, which gives it the name “Multiple Input, Multiple Output”, or “MIMO”, and it is massive. 5G is being licensed by governments in higher radio frequencies than 2G, 3G, or 4G mobile networks. The higher the radio frequency, the smaller the individual antenna, which means we can pack more of them into a single antenna unit. 5G is most commonly being licensed in what’s called the C-band, but it’s also being licensed in much higher millimetre-wave (mmWave) frequencies, where these Massive MIMO antennas will have hundreds of transmitters and receivers in a single unit. The more transmitters and receivers working simultaneously, the faster your Internet connection.
With so many antennas packed into a small space, we can use Artificial Intelligence (AI) to direct the radio waves to your specific location in a process called “beamforming”. Think of beamforming as a searchlight picking out exactly where you are. No longer will you have to wave your phone in the air to get a stronger signal. This AI will also manage the power of the radio beam so it just reaches your phone but goes no further to save electricity.
3. Fibre Optic
The third component is the physical network that links the antennas and transports data all around the country. You may hear it called “backhaul” or “transmission”, but here we’re going to call it the Transport Network. In the same way the Radio Access Network uses different radio frequencies to transmit data through the air, the Transport Network uses different light frequencies to transmit data over fibre-optic cable. There is no point using Massive-MIMO antennas in the Radio Access Network if you do not also increase capacity in the Transport Network behind it. We do this by increasing the number of light wavelengths we get down a single fibre-optic cable.
The latest 5G Transport Network will fit 120 different wavelengths down a single fibre, which is a huge amount of capacity. Optical signals travel faster than electrical signals, but to route your data around the country these optical signals have historically been converted to electrical signals to work out where they needed to go. No longer. The latest routing technology works directly with the light signals, so no conversion to electrical signals is needed, meaning your data will race through the network at the speed of light. Buffering of Ultra-HD 4K video should become a thing of the past.
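To see why more wavelengths matter, here is a back-of-the-envelope sketch. The 100 Gbps per-wavelength line rate and the 25 Mbps per 4K stream are assumptions chosen for illustration; real deployments vary.

```python
# Back-of-the-envelope transport capacity: wavelengths per fibre x rate per wavelength.
wavelengths_per_fibre = 120     # figure quoted above
gbps_per_wavelength = 100       # assumption: a common optical line rate

total_tbps = wavelengths_per_fibre * gbps_per_wavelength / 1000
streams_4k = total_tbps * 1_000_000 / 25   # assuming ~25 Mbps per 4K stream

print(f"One fibre: ~{total_tbps:.0f} Tbps, roughly {streams_4k:,.0f} simultaneous 4K streams")
```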
4. The Core
Finally, we get to the fourth component: the Core Network, where radio and light waves become recognisable as your data. The core network understands two sorts of information: Control Plane information, which determines what’s going on in the network, and User Plane information, which is yours. Don’t worry about the Control Plane. The User Plane is where the data becomes things you know about: websites, streamed video, VoIP calls, instant messaging, or emails. The name “Core Network” is misleading. It doesn’t mean it sits at the centre of the physical network. In fact, the 5G Core Network is a hierarchy of physical data centres that we’re now calling the Telco Cloud. This cloudified Core Network comprises racks and racks of computers and storage built in air-conditioned units everywhere from huge out-of-town facilities to the smallest telephone exchange at the “edge” of the network. It’s here, at the edge, that User Plane functions such as streamed video will sit, ensuring the best possible response time for whatever it is you’re doing online. The cloudification of the core network for 5G will bring with it concepts like Mobile Edge Computing (MEC) and Network Slicing that will enable things like the live football augmented reality app and the remote driving of vehicles that we mentioned earlier.
The 5 Keys to 5G
Speeds and feeds
Speeds of up to 20 Gbps will be achieved using a combination of innovations such as carrier aggregation (CA), massive multiple input multiple output (MIMO), and quadrature amplitude modulation (QAM).
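To show how those three levers multiply together, here is a deliberately simplified sketch of a peak-rate estimate. It is not a 3GPP calculation; the carrier count, bandwidth, MIMO layers, and efficiency factor are all assumptions picked so the numbers land near the headline 20 Gbps figure.

```python
import math

# Very simplified peak-rate estimate: carriers x bandwidth x MIMO layers x bits/symbol x efficiency.
# All numbers are illustrative assumptions, not a standards-accurate calculation.
def peak_rate_gbps(carriers, bandwidth_mhz, mimo_layers, qam_order, efficiency=0.8):
    bits_per_symbol = math.log2(qam_order)   # e.g. 256-QAM carries 8 bits per symbol
    rate_mbps = carriers * bandwidth_mhz * mimo_layers * bits_per_symbol * efficiency
    return rate_mbps / 1000

# 8 aggregated 100 MHz carriers, 4 MIMO layers, 256-QAM:
print(f"~{peak_rate_gbps(8, 100, 4, 256):.0f} Gbps")
```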
Unlicensed spectrum
Mobile network operators (MNOs) are increasingly using unlicensed spectrum in the 2.4 and 5 Gigahertz (GHz) frequency bands. 5G networks will need to tap into the vast amount of spectrum available in these unlicensed bands to offload traffic in heavily congested areas and provide connectivity for billions of IoT devices. Advancements in Wi-Fi, LTE in Unlicensed spectrum (LTE-U), License Assisted Access (LAA), and MulteFire, among others, provide better quality and regulated access to unlicensed spectrum.
Internet of Things (IoT)
IoT devices pose a diverse set of requirements and challenges for 5G networks, and a correspondingly diverse set of solutions has emerged, including NarrowBand IoT (NB-IoT), LTE Category M1 (LTE-M), Long Range (LoRa), and Sigfox.
Virtualization
Network functions virtualization (NFV) enables the massive scale and rapid elasticity that MNOs will require in their 5G networks. Virtualization enables a virtual evolved packet core (vEPC), centralized radio access network (C-RAN), mobile edge computing (MEC), and network slicing.
New Radio (NR)
Although the other 5G innovations introduced in this section all have strong starting points in LTE Advanced Pro, 5G NR is a true 5G-native radio access technology, standardized by 3GPP. 5G NR addresses the need for a new radio access technology that enables access speeds of up to 20 Gbps.
Artificial Intelligence (AI)
What is Artificial Intelligence?
The concept of what defines AI has changed over time, but at the core there has always been the idea of building machines which are capable of thinking like humans.
After all, human beings have proven uniquely capable of interpreting the world around us and using the information we pick up to effect change. If we want to build machines to help us do this more efficiently, then it makes sense to use ourselves as a blueprint.
AI, then, can be thought of as simulating the capacity for abstract, creative, deductive thought – and particularly the ability to learn which this gives rise to – using the digital, binary logic of computers.
Research and development work in AI is split between two branches. One is labelled “applied AI” which uses these principles of simulating human thought to carry out one specific task. The other is known as “generalized AI” – which seeks to develop machine intelligences that can turn their hands to any task, much like a person.
Artificial Intelligence (AI) represents machine-based intelligence, typically manifest in "cognitive" functions that humans associate with other human minds. There are a range of different technologies involved in AI including Machine Learning, Natural Language Processing, Deep Learning, and more. Cognitive Computing involves self-learning systems that use data mining, pattern recognition, and natural language processing to mimic the way the human brain works.
AI is increasingly integrated in many areas including Internet search, entertainment, commerce applications, content optimization, and robotics. The long-term prospect for these technologies is that they will become embedded in many different other technologies and provide autonomous decision making on behalf of humans, both directly, and indirectly through many processes, products, and services. AI is anticipated to have an ever increasing role in ICT including both traditional telecommunications as well as many communications enabled applications and digital commerce.
AI is rapidly becoming integrated into many aspects of communication, applications, content, and commerce. One such area transformed by AI is Customer Relationship Management (CRM). AI-enabled chatbots represent an advanced technology for automated CRM solutions. Existing User Interfaces (UI) do not scale very well. Chatbots represent a way for brands, businesses, and publishers to interact with users without requiring them to download an app, become familiar with a new UI, or configure and update it regularly. Chatbots provide conversational interfaces supported by AI to deliver automated, contextual communications.
AI is undergoing a transformation from silo implementations to a utility function across many industry verticals as a form of Artificial General Intelligence (AGI) capability. This capability is becoming embedded and/or associated with many applications, services, products, and solutions. Mind Commerce sees AI innovation in a variety of areas including personalized AI to both support and protect end-users. The Internet of Things (IoT) is a particularly important area for AI as a means for safeguarding assets, reducing fraud, and supporting analytics and automated decision making.
Another important industry solution for AI is Virtual Personal Assistant (VPA) applications, which use Autonomous Agents and Smart Machine technology to enable an Ambient User Experience for applications and services. VPAs rely upon software that provides advice while interfacing in a human-like fashion. The emerging role of intelligent VPAs encompasses answering questions in an advisory role and performing specific actions virtually on behalf of humans. The Internet of Things (IoT) intensifies this need as machines interact with other machines and humans autonomously.
What are the key developments in AI?
All of these advances have been made possible due to the focus on imitating human thought processes. The field of research which has been most fruitful in recent years is what has become known as “machine learning”. In fact, it’s become so integral to contemporary AI that the terms “artificial intelligence” and “machine learning” are sometimes used interchangeably.
However, this is an imprecise use of language, and the best way to think of it is that machine learning represents the current state of the art in the wider field of AI. The foundation of machine learning is that, rather than having to be taught to do everything step by step, machines can, if they are programmed to think like us, learn to work by observing, classifying, and learning from their mistakes, just like we do.
Perhaps the single biggest enabling factor has been the explosion of data which has been unleashed since mainstream society merged itself with the digital world. This availability of data – from things we share on social media to machine data generated by connected industrial machinery – means computers now have a universe of information available to them, to help them learn more efficiently and make better decisions.
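A tiny sketch makes the "learning from examples" idea concrete. The data below is invented purely for illustration, and the example assumes the scikit-learn library is installed.

```python
# Learning from examples instead of step-by-step rules: a toy classifier.
# The feature values and labels here are invented for illustration only.
from sklearn.tree import DecisionTreeClassifier

# Each example: [hours of daily use, age of device in years] -> did the customer upgrade?
X = [[1, 4], [6, 1], [5, 3], [0.5, 5], [7, 2], [2, 4]]
y = [1, 0, 0, 1, 0, 1]   # 1 = upgraded, 0 = did not

model = DecisionTreeClassifier().fit(X, y)   # the machine "observes" the examples
print(model.predict([[6, 4]]))               # and generalises to a case it has never seen
```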
What is the future of AI?
That depends on who you ask, and the answer will vary wildly!
Real fears have been voiced that an intelligence which equals or exceeds our own, but can work at far higher speeds, could have negative implications for the future of humanity. These fears come not just from apocalyptic sci-fi such as The Matrix or The Terminator, but from respected scientists like Stephen Hawking.
Even if robots don’t eradicate us or turn us into living batteries, a less dramatic but still nightmarish scenario is that automation of labour (mental as well as physical) will lead to profound societal change – perhaps for the better, or perhaps for the worse.
Data Analytics
What is Data Analytics?
Data analytics is the use of processes and technology, typically some sort of analytics software, to extract valuable insight out of datasets. This insight is then applied in a number of ways depending on the business, its industry, and unique requirements.
Data analytics is important because it helps businesses become data-driven, meaning decisions are supported through the use of data. Data analytics is also helping businesses to predict problems before they occur and map out possible solutions.
While more businesses turn to data analytics to identify gaps, there are still plenty of people who could use some clarification. That’s why we’re starting with the root of data analysis: discerning qualitative data from quantitative data.
The convergence of Cloud, Data Management, and IoT Platforms and Solutions is enabling the next evolution of data analytics, in which enterprises will realize significant tangible and intangible benefits from IoT data. The ability to sort data in a raw format, store it in different structural formats, and subsequently release it for further analytics will be of paramount importance for all industry verticals. IoT Data as a Service (IoTDaaS) offers convenient and cost-effective solutions to enterprises of various sizes and domains. IoTDaaS covers retrieving, storing, and analyzing information, and provides customers with any one of these services or an integrated package, depending on budget and requirements.
Every large corporation collects and maintains a huge amount of human-oriented data associated with its customers, including their preferences, purchases, habits, and other personal information. As the Internet of Things (IoT) progresses, there will be an increasingly large amount of unstructured machine data. The growing amount of human-oriented and machine-generated data will drive substantial opportunities for AI support of unstructured data analytics solutions. Industrial IoT (IIoT) and Enterprise IoT deployments will generate a substantial amount of data, most of which will be of the unstructured variety, requiring next-generation data analytics tools and techniques. Streaming IoT business data is highly valuable when it can be put into context and processed in real time, as it will facilitate completely new product and service offerings.
Section 1: Qualitative and quantitative data
Data analytics draws on both qualitative and quantitative data. The makeup of these data types matters because it determines how the data will be analyzed later on. Let’s start with qualitative data.
Understanding qualitative data
Qualitative data asks “why,” and consists of characteristics, attributes, labels, and other identifiers. Some examples of how qualitative data is generated include:
- Texts and documents
- Audio and video recordings
- Images and symbols
- Interview transcripts and focus groups
- Observations and notes
Qualitative data is descriptive and non-statistical, as opposed to quantitative data.
Understanding quantitative data
Quantitative data asks “how much” or “how many,” and consists of numbers and values. Some examples of how quantitative data is generated include:
- Tests
- Experiments
- Surveys
- Market research
- Metrics
Quantitative data is statistical, conclusive, and measurable, making it a better candidate for data analysis.
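In code, the distinction often shows up simply as numeric versus descriptive columns. Here is a small sketch using pandas; the column names and values are hypothetical.

```python
# Separating quantitative (numeric) columns from qualitative (descriptive) ones.
import pandas as pd

df = pd.DataFrame({
    "customer":  ["Ana", "Ben", "Chloe"],            # qualitative
    "feedback":  ["loved it", "too slow", "fine"],   # qualitative
    "purchases": [12, 3, 7],                         # quantitative
    "spend_usd": [240.50, 59.00, 130.25],            # quantitative
})

print("Quantitative:", list(df.select_dtypes(include="number").columns))
print("Qualitative: ", list(df.select_dtypes(exclude="number").columns))
```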
With a grasp on the two types of data, it’s now time to see why data structures make such a difference as well.
Section 2: Structured and unstructured data
Next, onto structured and unstructured data. How data is structured will determine how it is collected and processed, and which methods will need to be used to extract insight. Let’s start with structured data.
Understanding structured data
Structured data is most often categorized as quantitative data. It is, as you may have guessed by its name, highly-structured and organized so it can be easily searched in relational databases. Think of spreadsheets and tables. Some examples of structured data include:
- Names and dates
- Home and email addresses
- Identification numbers
- Transactional information
Structured data is generally preferred for data analysis since it’s much easier for machines to digest, as opposed to unstructured data.
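Here is a minimal sketch of why structured data is so easy to work with, using Python's built-in sqlite3 module. The table and values are invented for illustration.

```python
# Structured data fits neatly into rows and columns, so it can be queried directly.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL, placed TEXT)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)", [
    (1, "Ana", 240.50, "2024-03-01"),
    (2, "Ben",  59.00, "2024-03-02"),
    (3, "Ana", 130.25, "2024-03-05"),
])

# Because the structure is known in advance, questions like this are trivial:
for customer, total in con.execute("SELECT customer, SUM(total) FROM orders GROUP BY customer"):
    print(customer, round(total, 2))
```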
Understanding unstructured data
Unstructured data actually accounts for more than 80 percent of all data generated today. The downside to this is that unstructured data cannot be collected and processed using conventional tools and methods.
To harness unstructured data, more modern approaches like utilizing NoSQL databases or loading raw data into data lakes will need to be considered. Some examples of unstructured data include:
- Emails and SMS
- Audio and video files
- Social media
- Satellite and surveillance imagery
- Server logs and weblogs
Making sense of unstructured data isn’t an easy task, but for more predictive and proactive insights, more businesses are looking at ways to deconstruct it.
Section 3: The data analysis process
Now that we know the anatomy of data, it’s time to see the steps businesses have to take to analyze it. This is known as the data analysis process.
Step 1
The first step in this process is defining a need for analysis. Are sales dwindling? Are production costs soaring? Are customers satisfied with your product? These are questions that will need to be considered.
Step 2
Next, onto collecting data. A business will typically gather structured data from its internal sources, such as CRM software, ERP systems, marketing automation tools, and more. There are also many open data sources to gather external information. For example, accessing finance and economic datasets to locate any patterns or trends.
Step 3
After you have all the right data, it’s time to sort through and clean any duplicates, anomalous data, and other inconsistencies that could skew the analysis.
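As a small illustration of this cleaning step, here is a sketch using pandas. The dataset is invented, and the "valid amount" range is an assumption you would tune to your own business.

```python
# Step 3 in miniature: remove duplicates and obviously anomalous rows before analysis.
import pandas as pd

sales = pd.DataFrame({
    "order_id": [101, 102, 102, 103, 104],
    "amount":   [59.0, 130.3, 130.3, -5.0, 9999.0],
})

cleaned = (
    sales.drop_duplicates(subset="order_id")          # order 102 was recorded twice
         .query("amount > 0 and amount < 5000")       # drop the glitch and the outlier
)
print(cleaned)
```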
Step 4
Now for the analysis, and there are a number of ways to do so. For example, business intelligence software could generate charts and reports that are easily understood by decision-makers. One could also perform a variety of data mining techniques for deeper analysis. This step depends on the business’ requirements and resources.
Step 5
The final step is putting analysis into action. How one interprets the results of the analysis is crucial for resolving the business problem brought up in step one.
Data analysis may have a set of steps, but not every analysis shows the same picture, which brings us to the next topic.
Section 4: Types of data analytics
Not all analyses are created equal. Each has its own level of complexity and depth of insight. Below are the four types of data analytics you’ll commonly hear about.
1. Descriptive analytics
Descriptive analytics is introductory, retrospective, and is the first step of identifying “what happened” regarding a business query. For example, this type of analysis may point toward declining website traffic or an uptick in social media engagement. Descriptive analytics is the most common type of business analytics today.
2. Diagnostic analytics
Diagnostic analytics is also retrospective, but it identifies “why” something may have occurred. It is a more in-depth, drilled-down analytical approach and may apply data mining techniques to provide context to a business query.
3. Predictive analytics
Predictive analytics attempts to forecast what is likely to happen next based on historical data. This is a type of advanced analytics, utilizing data mining, machine learning, and predictive modeling.
The usefulness of predictive analytics software transcends many industries. Banks are using it for clearer fraud detection, manufacturers are using it for predictive maintenance, and retailers are using it to identify up-sell opportunities.
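Here is a minimal sketch of the idea behind predictive analytics: fit a trend to historical data and project it forward. The monthly figures are invented, and the example assumes scikit-learn is available.

```python
# Fit a trend to past months and forecast the next one (toy example).
from sklearn.linear_model import LinearRegression

months = [[1], [2], [3], [4], [5], [6]]
sales  = [100, 108, 115, 123, 131, 140]    # invented historical data

model = LinearRegression().fit(months, sales)
print(f"Forecast for month 7: {model.predict([[7]])[0]:.0f}")
```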
4. Prescriptive analytics
Prescriptive analytics is an analysis of extreme complexity, often requiring data scientists with prior knowledge of prescriptive models. Utilizing both historical data and external information, prescriptive analytics could provide calculated next steps a business should take to solve its query.
While every business would love to tap prescriptive analytics, the amount of resources needed is simply not feasible for many. Still, there are some analytics trends we can expect to take shape soon.
Internet of Things (IoT)
We all know that IoT is changing industries across the board – from agriculture to healthcare to manufacturing and everything in between – but what is IoT, exactly? Working for an Internet of Things (IoT) company, I get asked that question all the time, and I’ve worked hard to boil the answer down to something anyone can understand. Here’s everything you need to know about the Internet of Things.
What is Internet of Things (IoT)?
How are you reading this post right now? It might be on desktop, on mobile, maybe a tablet, but whatever device you’re using, it’s most definitely connected to the internet.
An internet connection is a wonderful thing; it gives us all sorts of benefits that just weren’t possible before. If you’re old enough, think of your cellphone before it was a smartphone. You could call and you could text, sure, but now you can read any book, watch any movie, or listen to any song, all in the palm of your hand. And that’s just to name a few of the incredible things your smartphone can do.
Connecting things to the internet yields many amazing benefits. We’ve all seen these benefits with our smartphones, laptops, and tablets, but this is true for everything else too. And yes, I do mean everything.
“IoT means taking all the things in the world and connecting them to the internet.”
IoT definition for beginners
I think that confusion arises not because the concept is so narrow and tightly defined, but rather because it’s so broad and loosely defined. It can be hard to nail down the concept in your head when there are so many examples and possibilities in IoT.
To help clarify, I think it’s important to understand the benefits of connecting things to the internet. Why would we even want to connect everything to the internet?
Why IoT Matters?
When something is connected to the internet, that means that it can send information or receive information, or both. This ability to send and/or receive information makes things smart, and smart is good.
Let’s use smartphones again as an example. Right now you can listen to just about any song in the world, but it’s not because your phone actually has every song in the world stored on it. It’s because every song in the world is stored somewhere else, and your phone can send information (asking for that song) and then receive information (streaming that song on your phone).
To be smart, a thing doesn’t need to have super storage or a supercomputer inside of it. All a thing has to do is connect to super storage or to a supercomputer. Being connected is awesome.
In the Internet of Things, all the things that are being connected to the internet can be put into three categories:
- Things that collect information and then send it.
- Things that receive information and then act on it.
- Things that do both.
And all three of these have enormous benefits that feed on each other.
1. Collecting and Sending Information
This means sensors. Sensors could be temperature sensors, motion sensors, moisture sensors, air quality sensors, light sensors, you name it. These sensors, along with a connection, allow us to automatically collect information from the environment which, in turn, allows us to make more intelligent decisions.
On the farm, automatically getting information about the soil moisture can tell farmers exactly when their crops need to be watered. Instead of watering too much (which can be an expensive over-use of irrigation systems and environmentally wasteful) or watering too little (which can be an expensive loss of crops), the farmer can ensure that crops get exactly the right amount of water. More money for farmers and more food for the world!
Just as our sight, hearing, smell, touch, and taste allow us, humans, to make sense of the world, sensors allow machines to make sense of the world.
2. Receiving and Acting on Information
We’re all very familiar with machines getting information and then acting. Your printer receives a document and it prints it. Your car receives a signal from your car keys and the doors open. The examples are endless.
Whether it’s as simple as sending the command “turn on” or as complex as sending a 3D model to a 3D printer, we know that we can tell machines what to do from far away. So what?
The real power of the Internet of Things arises when things can do both of the above. Things that collect information and send it, but also receive information and act on it.
3. Doing Both
Let’s quickly go back to the farming example. The sensors can collect information about the soil moisture to tell the farmer how much to water the crops, but you don’t actually need the farmer. Instead, the irrigation system can automatically turn on as needed, based on how much moisture is in the soil.
You can take it a step further too. If the irrigation system receives information about the weather from its internet connection, it can also know when it’s going to rain and decide not to water the crops today because they’ll be watered by the rain anyways.
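A toy version of that irrigation controller might look like the sketch below. The sensor reading, the dryness threshold, and the weather check are all invented for illustration; a real system would pull them from actual sensors and a weather service.

```python
# A device that both collects (soil moisture) and acts (irrigation valve).
def should_irrigate(soil_moisture_pct: float, rain_expected: bool,
                    dry_threshold: float = 30.0) -> bool:
    """Water only if the soil is dry AND no rain is forecast."""
    return soil_moisture_pct < dry_threshold and not rain_expected

reading = 22.0           # would come from the moisture sensor
forecast_rain = False    # would come from a weather service over the internet
print("Open irrigation valve:", should_irrigate(reading, forecast_rain))
```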
And it doesn’t stop there! All this information about the soil moisture, how much the irrigation system is watering the crops, and how well the crops actually grow can be collected and sent to supercomputers that run amazing algorithms that can make sense of all this information.
And that’s just one kind of sensor. Add in other sensors like light, air quality, and temperature, and these algorithms can learn much much more. With dozens, hundreds, thousands of farms all collecting this information, these algorithms can create incredible insights into how to make crops grow the best, helping to feed the world’s growing population.
3 Categories of Internet of Things (IoT) Solutions
1. Consumer IoT
In terms of Consumer IoT, there are a few particularly important consumer-oriented markets including Connected Automobiles, Connected Homes, and personal electronics such as Wearable Technology.
Connected Automobiles refers to the use of IoT and broadband communications (LTE, WiFi, and soon 5G) technology in the car with the use of smartphones or other technologies typically manifest as handheld or wearable devices. Vehicles are at the forefront of a major convergence happening that includes a few key technologies: 5G, Artificial Intelligence, Data Management (Big Data, Analytics, Visualization, etc.), Cloud Technologies, and IoT.
Connected (or Smart) Homes are Internet-connected residences that provide an enhanced lifestyle for their occupants by way of home automation as well as enhanced information, entertainment, and safety services. The Connected Home ecosystem is rapidly expanding beyond merely Connected Entertainment (TV, Receiver, DVD Recorder, Media Player, Gaming Consoles) to include many areas such as Home and Office Equipment (Printer, VoIP Phone, etc.), Personal Consumer Electronics (Wireless IP Camera, Smartphone, Tablet, Portable Media Players, Navigation Devices, etc.), Energy Management (Temperature, Lighting, Heating and Air Conditioning), Safety, Smart Consumer Appliances (Washing Machine, Refrigerator, etc.), and more.
Wearable technology is increasingly becoming an important medium for communication, infotainment services, health solutions, and textile, military, and industrial applications. Wearables provide both a new user interface and a convenient, always-available means of signaling, communications, and control via IoT.
This segment has the potential for massive transformation in many industries. Early adopter industries include clothing, healthcare, sports, and fitness. For example, wearable devices and digital healthcare represent two dominant trends that are poised to redefine virtually everything about how health products and services are delivered and supported. Ranging from telemedicine to self-monitoring and diagnosis, wearable devices and IoT will start as a novelty and achieve necessity status as insurance company cost optimization becomes the main driver for adoption and usage.
2. Enterprise IoT
Enterprise IoT is concerned with a variety of factors dealing with business operations efficiency and effectiveness. For example, one important area to consider is the transition from traditional Enterprise Resource Planning (ERP) to IoT-enabled ERP, and the impact of IoT-enabled ERP on the enterprise as a whole. Leading ERP solution providers are adding IoT capabilities into ERP systems to generate meaningful insights for businesses. ERP systems are being coupled with sensors and other IoT devices that transmit data into the ERP system in real time without human intervention.
In another example that cuts across the Consumer, Enterprise, and Industrial IoT markets, consumer appliance data is fed directly into the manufacturer's ERP system without any middleman system. This expedites fault-finding and proactive maintenance using machine-generated data. This type of consumer-centric ERP process will be the new reality for enterprise ERP systems integrated with IoT solutions.
3. Industrial IoT
The industrial sector is rapidly integrating the Internet of Things (IoT) with other key technologies such as 3D Printing, Big Data, and Streaming Analytics. Typically referred to as the Industrial Internet of Things (IIoT) or simply the Industrial Internet, IoT in industry includes Connected Manufacturing, in which the combination of certain key technologies is anticipated to substantially advance the Industry 4.0 revolution towards increasingly smarter manufacturing. In terms of core functionality for Connected Manufacturing, IIoT provides the basis for communications, control, and automated data capture. Data Analytics provides the means to process vast amounts of machine-generated and often unstructured data. Accordingly, Big Data technologies and predictive analytics enable streamlining of industrial processes. AI technology provides the means to further automate decision making and to engage machine learning for ongoing efficiency and effectiveness improvements.
IIoT is poised to transform many industry verticals such as Agriculture, Automotive, Healthcare, and more. Initially focusing on improving existing processes and augmenting current infrastructure, IIoT will evolve to encompass next-generation methods and procedures. For example, IoT in Agriculture (IoTAg) represents a more specific use of technology wherein agricultural planning and operations become connected in ways previously impossible were it not for advances in sensors, communications, data analytics, and other IoTAg areas. IoT in Healthcare is another promising example. The evolving area of Real-Time Remote Medical Diagnosis Systems promises to revolutionize the detection and prescriptive abilities of healthcare diagnostics as IoT technologies integrate with Electronic Healthcare Records systems.
Near Field Communication (NFC)
What is NFC?
NFC (near field communication) is what enables two devices to communicate wirelessly when they’re close together. NFC is actually a subset of something called RFID (radio-frequency identification), a technology that allows us to identify things through radio waves. RFID is nothing new—it’s been used for decades for things like scanning items in grocery stores and luggage on baggage claims, and tagging cattle.
NFC, which was introduced in the early 2000s, uses a specific RFID frequency (13.56MHz, to be exact) for close-range communications. To date, one of the more common uses for NFC is identification cards to gain access to places like office buildings and private garages. But increasingly, NFC is being used to power something called “contactless” payments.
NFC isn’t just useful on its own—it can also be used in conjunction with other cutting-edge technologies such as the Internet of Things (IoT). From smartphones to home automation, this article will discuss the ways in which NFC and IoT intersect.
NFC enables simplified transactions, data exchange, pairing, wireless connections, and convenience between two objects when in close proximity to one another (up to 10 cm apart). Because the communication is one-to-one and requires such close proximity, data privacy is more inherent than with other wireless approaches.
The benefits of NFC include easy connections, rapid transactions, and simple exchange of data. NFC serves as a complement to other popular wireless technologies such as Bluetooth, which has a wider range than NFC but which also consumes more power.
How NFC Works with IoT?
Have you ever wondered about the science behind tap-and-go technologies like Apple Pay and contactless credit cards? In many cases, these services are powered by a method of wirelessly transferring data called near-field communication (NFC).
The Internet of Things (IoT) is a massive network of billions of devices, from industrial sensors to self-driving cars, that are connected to the Internet in order to collect and exchange information. Tech market research company Juniper Research projects that by 2020, there will be 38.5 billion IoT-connected gadgets.
By enabling closer integration and communication between devices, the IoT is widely expected to shake up the ways that people live, work, and play. However, there are a few serious roadblocks that stand on the path to mainstream IoT adoption.
For example, how do IoT objects know what a user is intending to do? How can you develop IoT devices that are secure from external attacks? How can you connect unpowered objects to the IoT?
NFC solves many of the challenges associated with IoT:
1. With a straightforward tap-and-go mechanism, NFC makes it simple and intuitive to connect two different IoT devices.
2. Because NFC chips must be in close proximity to each other to initiate a transaction, an NFC tap is a clear sign that the user intends to take a certain action. The short range of NFC also protects against unauthorized access by hackers.
3. NFC includes built-in features such as encryption that cut down on the potential for eavesdropping and other malicious activities.
4. Even objects without power or an IoT connection can passively exchange data via NFC tags. Users with an NFC-enabled device can tap the gadget to get information such as URLs.
For example, NFC technology can be used as a substitute for hotel key cards. By downloading your hotel reservation to a mobile app, the NFC chip in your smartphone becomes a key that can unlock your door. In addition, NFC technology can be integrated almost anywhere you might need cheap, battery-less electronic tags like in event tickets and animal tags for wildlife or livestock tracking.
Another major NFC use case for IoT is home automation. For example, introducing a new device onto your “smart home” network can be a laborious process that involves long passwords and complicated configurations.
You can skip this process by equipping your home with an NFC-enabled IoT “gateway” that serves as the nexus for all IoT applications. When you introduce a new device with an NFC tag, you can simply tap the device against the gateway to automatically connect it to your home network.
A second challenge for building a unified smart home is the use of different communications technologies, such as Wi-Fi and Bluetooth. NFC tags can bridge the gap between these technologies with a single tap, letting you do away with the time-consuming process of device discovery and pairing.
Why NFC is the critical link to IoT?
According to market research, soon more users will access the Internet wirelessly via mobile devices than from wired Ethernet connections. These mobile devices offer several different wireless connectivity options, each with different strengths and capabilities. But only NFC is specifically designed and engineered to provide zero-power operation and maximize privacy, both at very low cost.
Privacy
NFC by design has a limited field of operation, which prevents data snooping that could occur from a distance. It also requires intent—the application of an NFC-enabled device to an NFC-enabled object—in order to read its memory. This approach is in contrast to protocols such as WiFi, which require radios to broadcast information regardless of intent. The limited field plus other features of the protocol help to ensure that data exchange only occurs with the intended party.
Low power
When communicating between an NFC reader and an NFC transponder (tag), energy harvested from the RF field of the reader powers the tag, enabling connectivity for Internet of Things (IoT) devices without batteries or a dedicated power supply. This energy-harvesting feature enables a number of low-power and low-cost applications.
Low cost
Adding a connected NFC tag to an embedded system can establish connectivity to mobile devices at much lower cost than Bluetooth or WiFi approaches. In addition, eliminating the need for a battery in an embedded system can further lower an application’s overall bill of materials.
Comparing wireless protocols
Designers have several choices for connectivity, all with trade-offs. WiFi, ZigBee, and Bluetooth all have different strengths and capabilities. None, however, was specifically designed and engineered to provide zero-power operation and maximize privacy, and to do both at very low cost, as NFC is.
NFC Principles of Operation
NFC has three communication modes: Read/Write, Peer-to-Peer, and Card Emulation.
Read/Write mode
In Read/Write mode, an NFC reader/writer (or NFC-enabled mobile phone acting as a traditional contactless reader/writer) reads data from NFC-enabled smart objects and acts upon that information. With an NFC-enabled phone, for example, users can automatically connect to websites via a retrieved URL, send short message service (SMS) texts without typing, obtain coupons, etc., all with only a touch of their device to the object.
Peer-to-Peer mode
In Peer-to-Peer mode, any NFC-enabled reader/writer can communicate to another NFC reader/writer to exchange data with the same advantages of safety, security, intuitiveness, and simplicity inherent in Read/Write mode. In Peer-to-Peer mode, one of the reader/writers behaves as a tag, creating a communication link. For example, two devices (such as smartphones) with readers/writers can communicate with each other.
Card Emulation mode
An NFC device in Card Emulation mode can replace a contactless smartcard, enabling use of NFC-enabled devices within the existing contactless card infrastructure for operations such as ticketing, access control, transit, tollgates, and contactless payments.
NFC Read/Write mode for embedded systems
Most embedded applications that utilize NFC will use Read/Write mode for the link. In these cases, an NFC-enabled device, such as a mobile device, will provide the active reader, and the tag will be in the embedded system.
Functionally, a connected NFC tag in an embedded system behaves similarly to a dual port memory. One of the memory ports is accessed wirelessly through an NFC interface. The other port is accessed by the embedded system.
Through this functionality, data can pass from an external source (e.g., an NFC-enabled mobile device) to the embedded system. Furthermore, because NFC connected tags are passive, they can be read from, or written to, by the external source even when the embedded system is powered off.
Because NFC connected tags function similarly to dual-port memories, they facilitate any application that requires data transfer between an embedded system and an external system with an NFC reader/writer, such as an NFC-enabled mobile device.
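The sketch below is a toy model of that dual-port behaviour: one "port" is written over the air by an NFC reader, the other is read by the embedded system. It is purely illustrative; real connected tags exchange NDEF messages over defined interfaces, not Python dictionaries.

```python
# Toy model of a connected NFC tag behaving like a dual-port memory.
class ConnectedTag:
    def __init__(self):
        self._memory = {}

    # Port 1: the NFC air interface (works even while the embedded system is powered off)
    def nfc_write(self, key, value):
        self._memory[key] = value

    # Port 2: the embedded system's wired interface
    def host_read(self, key):
        return self._memory.get(key)

tag = ConnectedTag()
tag.nfc_write("wifi_ssid", "Hotel-Guest")   # a phone taps the tag and writes a setting
print(tag.host_read("wifi_ssid"))           # the appliance reads it when it powers up
```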
Blockchain Technology
What is Blockchain?
Blockchain is the name of a whole new technology. As the name states, it is a sequence of blocks or groups of transactions that are chained together and distributed among the users.
“The blockchain is an incorruptible digital ledger of economic transactions that can be programmed to record not just financial transactions but virtually everything of value.”
In the end, it works as an immutable record of transactions that does not require an external authority to validate the authenticity and integrity of the data. Transactions are typically economic, but we can store any kind of information in the blocks.
Even though we call it a ‘new’ technology, its origins date back to 1991, when Haber and Stornetta published “How to Time-Stamp a Digital Document” in the Journal of Cryptology. However, it is only now that its popularity has soared, thanks to the success of Bitcoin and other cryptocurrencies.
What is NOT Blockchain?
Before describing the Blockchain, we will start by clarifying what is NOT Blockchain. Many people misunderstand the terms and concepts, leading to typical mistakes like the following:
- Blockchain is NOT a cryptocurrency.
- Blockchain is NOT a programming language.
- Blockchain is NOT a cryptographic codification.
- Blockchain is NOT an AI or Machine Learning technology.
- Blockchain is NOT a Python library or framework.
How does it work?
The value of the Blockchain technology comes from the distributed security of the system. For this reason, there are several characteristics that are completely necessary for developing or using a Blockchain.
We describe the 5 key concepts that are the basis of the Blockchain technology as we know it to date:
1. Cryptographic Hash
2. Immutable Ledger
3. P2P Network
4. Consensus Protocol
5. Block Validation or ‘Mining’
CRYPTOGRAPHIC HASH
A Hash is a cryptographic function that transforms any input data into a fixed-length string of characters. In practice, each distinct input to the hash function produces a different output, and the result is deterministic: if you use the same input, the output value will always be the same.
One of the most important features of the Hash functions is that the conversion is one-way: you cannot reverse the function to generate the original input.
There are many algorithms that create different Hash variations. For every distinct input, the algorithm generates a completely different output, and it is not possible to predict how changes to the input will affect the output.
The Blockchain nodes use Hash functions to create a unique identifier of any block of transactions. Every block includes the Hash value of the previous block.
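You can try this yourself with Python's standard hashlib module. The "transactions" below are just strings invented for the example.

```python
# Same input -> same digest; a one-character change -> a completely different digest.
import hashlib

def sha256(text: str) -> str:
    return hashlib.sha256(text.encode()).hexdigest()

print(sha256("Alice pays Bob 5"))
print(sha256("Alice pays Bob 5"))   # identical: hashing is deterministic
print(sha256("Alice pays Bob 6"))   # tiny change in input, unrecognisably different output
```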
IMMUTABLE LEDGER
This feature is tightly related to the previous one. Since every block of the chain contains the Hash of the previous one, it is not possible to modify any block without changing the entire chain. Hence, the chain works as an immutable digital ledger.
Let us see an example. Suppose we have a chain in which every block has been hashed and the hash is included in the following block. If an anonymous attacker removes, adds, or modifies any transaction in the first block, HASH#1 will change. HASH#1 is included as part of the contents of Block 2, so HASH#2 will change too, and the error will propagate to every block of the chain after the block under attack. Users will then declare the chain invalid.
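The sketch below shows the same propagation effect in a few lines of Python: each block's hash is computed over the previous block's hash, so rewriting an old transaction breaks every hash that follows it. The transactions are invented for illustration.

```python
# Minimal hash-chained ledger: tampering with an old block invalidates the rest of the chain.
import hashlib

def block_hash(prev_hash: str, transactions: str) -> str:
    return hashlib.sha256((prev_hash + transactions).encode()).hexdigest()

h1 = block_hash("0" * 64, "Alice pays Bob 5")
h2 = block_hash(h1, "Bob pays Carol 2")

# An attacker rewrites the first block...
h1_tampered = block_hash("0" * 64, "Alice pays Bob 500")
h2_recheck = block_hash(h1_tampered, "Bob pays Carol 2")

print(h2 == h2_recheck)   # False: the change propagates and the chain no longer matches
```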
PEER-TO-PEER (P2P) NETWORK
The Blockchain does not need any external or internal trust authority. This is possible because the Blockchain data is distributed among all the users. Every user has their own copy of the transactions and hashed blocks, and they spread the information about any new transaction to the entire network. This way, it is not possible for anyone to alter the information in the chain, since it is stored not by an individual entity but by an entire network of nodes.
Once a block of transactions is validated, it is added to the chain and every user updates their local copy. Even if an attacker were to modify your local chain, the network would not accept any block from the altered blockchain.
CONSENSUS PROTOCOL
But which chain is the real blockchain? Users need to reach an agreement about the validity of the chain before adding more blocks.
Every time a node adds a new block, all the users have to validate the block by using a common protocol. Typically, the nodes reach a consensus about the correctness of a new block by Proof of Work or Proof of Stake methods.
The nodes check that the new block meets the requirements of their Proof method, including validation of all the transactions inside the block. If the block is valid, they consider it part of the Blockchain and keep adding new blocks.
If different users hold different, apparently valid chains, they will discard the shorter one and select the longest chain as the main Blockchain. As in any Byzantine Fault Tolerance (BFT) system, they will reach agreement on the correct chain as long as at least 2/3 of the nodes are not malicious.
BLOCK VALIDATION OR ‘MINING’
This feature is actually not completely necessary for a Blockchain, as we can see with examples like the CREDITS platform. However, it is probably one of the most famous aspects of Blockchain, thanks to the Bitcoin chain.
The term ‘mining’ refers to the act of meeting the Proof of Work requirements for adding a new block with pending transactions to the Blockchain. There are many different mining methods, as they are custom-defined for each chain.
The PoW method usually requires the user to create a block with restrictions on its Hash code. Since the Hash code is unpredictable, the ‘miners’ have to test many possible combinations before meeting the requirements. These restrictions define the difficulty of the network.
Once a ‘miner’ node finds the solution to the PoW problem, it adds the block to the chain and every other node checks the validity of the PoW according to the Consensus Protocol. If the block is legitimate, they will include it in their own local copies of the Blockchain.
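Here is a toy Proof of Work loop to make the idea concrete: keep changing a "nonce" until the block's hash starts with a required number of zeros. The difficulty is kept tiny so it finishes instantly; real networks use vastly harder targets.

```python
# Toy mining: search for a nonce whose hash meets the difficulty target.
import hashlib

def mine(prev_hash: str, transactions: str, difficulty: int = 4):
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{prev_hash}{transactions}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

nonce, digest = mine("0" * 64, "Alice pays Bob 5")
print(f"Found nonce {nonce}: {digest}")
```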
Is There More Than One Type of Blockchain?
Public Blockchains
Public blockchains run on open-source software used by everyone participating in the network. Anyone can join, and the network has a global reach. For example, a lot of cryptocurrencies are built on existing blockchains, ERC20 tokens being the most well-known example built on Ethereum.
Private Blockchains
These use the same principles as public ones, except the software is proprietary and hosted on private servers instead. Companies such as Walmart are developing their own blockchains to track supply-chain logistics.
Why is blockchain important?
We are all now used to sharing information through a decentralized online platform: the internet. But when it comes to transferring value – e.g. money, ownership rights, intellectual property, etc. – we are usually forced to fall back on old-fashioned, centralized institutions or establishments like banks or government agencies. Even online payment methods which have sprung into existence since the birth of the internet – PayPal being the most obvious example – generally require integration with a bank account or credit card to be useful.
Blockchain technology offers the intriguing possibility of eliminating this “middleman”. It does this by filling three important roles – recording transactions, establishing identity and establishing contracts – traditionally carried out by the financial services sector.
This has huge implications because, worldwide, the financial services market is the largest sector of industry by market capitalization. Replacing even a fraction of this with a blockchain system would result in a huge disruption of the financial services industry, but also a massive increase in efficiencies.
The third role, establishing contracts, opens up a treasure trove of opportunities. Apart from a unit of value (like a bitcoin), blockchain can be used to store any kind of digital information, including computer code.
That snippet of code could be programmed to execute whenever certain parties enter their keys, thereby agreeing to a contract. The same code could read from external data feeds — stock prices, weather reports, news headlines, or anything that can be parsed by a computer, really — to create contracts that are automatically executed when certain conditions are met.
These are known as “smart contracts,” and the possibilities for their use are practically endless.
For example, your smart thermostat might communicate energy usage to a smart grid; when a certain number of watt-hours has been reached, a blockchain transaction automatically transfers value from your account to the electric company, effectively automating the meter reader and the billing process.
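The sketch below captures only the trigger logic of that thermostat example. Real smart contracts run on a blockchain (for instance, written in Solidity on Ethereum); this Python class, its threshold, and its prices are all invented to illustrate the "when X happens, transfer value" pattern.

```python
# Toy "smart contract": once metered usage crosses a threshold, value moves automatically.
class MeteringContract:
    def __init__(self, threshold_wh: float, price_per_wh: float):
        self.threshold_wh = threshold_wh
        self.price_per_wh = price_per_wh
        self.usage_wh = 0.0

    def report_usage(self, wh: float, wallet: dict):
        """Called by the thermostat; pays the utility once the threshold is reached."""
        self.usage_wh += wh
        if self.usage_wh >= self.threshold_wh:
            amount = self.usage_wh * self.price_per_wh
            wallet["customer"] -= amount
            wallet["utility"] += amount
            self.usage_wh = 0.0   # start a new billing cycle

wallet = {"customer": 100.0, "utility": 0.0}
contract = MeteringContract(threshold_wh=500, price_per_wh=0.02)
for reading in [120, 200, 210]:           # hourly readings from the smart thermostat
    contract.report_usage(reading, wallet)
print(wallet)                             # value has moved with no meter reader or invoice
```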
Or, smart contracts might be put to use in the regulation of intellectual property, controlling how many times a user can access, share, or copy something. They could be used to create fraud-proof voting systems, censorship-resistant information distribution, and much more.
The point is that the potential uses for this technology are vast, and I predict that more and more industries will find ways to put it to good use in the very near future.
Augmented Reality (AR)
What is Augmented Reality?
Augmented reality is the technology that expands our physical world by adding layers of digital information onto it. Unlike Virtual Reality (VR), AR does not create a whole artificial environment to replace the real one with a virtual one. AR appears in a direct view of an existing environment and adds sounds, video, and graphics to it.
“A view of the physical real-world environment with superimposed computer-generated images, thus changing the perception of reality, is the AR.”
The term itself was coined back in 1990, and some of the first commercial uses were in television and the military. With the rise of the Internet and smartphones, AR rolled out its second wave and is now mostly associated with interactive experiences. 3D models are directly projected onto physical things or fused together in real time, and various augmented reality apps influence our habits, social life, and the entertainment industry.
AR apps typically anchor digital animation to a special ‘marker’, or pinpoint the location with the help of GPS in phones. Augmentation happens in real time and within the context of the environment, for example overlaying scores onto a live feed of sports events.
How does Augmented Reality work?
For many of us, the question “what is Augmented Reality?” implies a technical one: how does AR work? AR can use a range of data (images, animations, videos, 3D models), and people see the result in both natural and synthetic light. Also, unlike in VR, users remain aware of being in the real world, which is augmented by computer vision.
AR can be displayed on various devices: screens, glasses, handheld devices, mobile phones, and head-mounted displays. It involves technologies like S.L.A.M. (simultaneous localization and mapping), depth tracking (briefly, sensor data used to calculate the distance to objects), and the following components:
Cameras and sensors
These collect data about the user's interactions and send it for processing. Cameras on devices scan the surroundings, and with this information the device locates physical objects and generates 3D models. They may be special-purpose cameras, as in Microsoft HoloLens, or common smartphone cameras for taking pictures and videos.
Processing
AR devices eventually should act like little computers, something modern smartphones already do. In the same manner, they require a CPU, a GPU, flash memory, RAM, Bluetooth/WiFi, a GPS, etc. to be able to measure speed, angle, direction, orientation in space, and so on.
Projection
This refers to a miniature projector on AR headsets, which takes data from the sensors and projects digital content (the result of processing) onto a surface to view. In fact, projection-based AR is still maturing and is not yet widely used in commercial products or services.
Reflection
Some AR devices have mirrors to assist human eyes to view virtual images. Some have an “array of small curved mirrors” and some have a double-sided mirror to reflect light to a camera and to a user’s eye. The goal of such reflection paths is to perform a proper image alignment.
Types of Augmented Reality
1. Marker-based AR
Some also call it image recognition, as it requires a special visual object and a camera to scan it. The marker may be anything from a printed QR code to a special sign. In some cases, the AR device also calculates the position and orientation of the marker to position the content. Thus, a marker initiates digital animations for users to view, and images in a magazine may turn into 3D models.
2. Markerless AR
Also known as location-based or position-based augmented reality, this type utilizes GPS, a compass, a gyroscope, and an accelerometer to provide data based on the user’s location. This data then determines what AR content you find or receive in a certain area. On smartphones, this type of AR typically produces maps, directions, and information about nearby businesses. Applications include event information, pop-up business ads, and navigation support.
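The core of location-based AR is simply "am I close enough to this point of interest to show its content?". Here is a sketch of that check using the haversine distance between two GPS fixes; the coordinates and radius are invented for illustration.

```python
# Show a piece of AR content only when the user is within its trigger radius.
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in metres (haversine formula)."""
    r = 6371000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

user = (51.5560, -0.2795)   # the phone's GPS fix
poi = {"name": "Stadium info overlay", "lat": 51.5560, "lon": -0.2796, "radius_m": 150}

if distance_m(*user, poi["lat"], poi["lon"]) <= poi["radius_m"]:
    print(f"Show AR content: {poi['name']}")
```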
3. Projection-based AR
This type projects synthetic light onto physical surfaces and, in some cases, allows you to interact with it. These are the holograms we have all seen in sci-fi movies like Star Wars. User interaction with a projection is detected by its alterations.
4. Superimposition-based AR
This type replaces the original view with an augmented one, fully or partially. Object recognition plays a key role; without it the whole concept is simply impossible. We’ve all seen an example of superimposition-based augmented reality in the IKEA Catalog app, which allows users to place virtual items from the furniture catalog in their rooms.
Augmented reality devices
Many modern devices already support augmented reality, from smartphones and tablets to gadgets like Google Glass and other handheld devices, and these technologies continue to evolve. For processing and projection, AR hardware first of all needs sensors, cameras, an accelerometer, a gyroscope, a digital compass, GPS, a CPU, and displays, the things we’ve already mentioned.
Devices suitable for Augmented reality fall into the following categories:
- Mobile devices (smartphones and tablets)
Smartphones and tablets are the most widely available devices and the best fit for AR mobile apps, which range from pure gaming and entertainment to business analytics, sports, and social networking.
- Special AR devices
These are designed primarily, and often solely, for augmented reality experiences. One example is the head-up display (HUD), which sends data to a transparent display directly in the user's view. Originally introduced to train military fighter pilots, such devices now have applications in aviation, the automotive industry, manufacturing, sports, and more.
- AR glasses (or smart glasses)
Examples include Google Glass, Meta 2 Glasses, Laster See-Thru, and Laforge AR eyewear. These units can display notifications from your smartphone, assist assembly-line workers, provide hands-free access to content, and more.
- AR contact lenses (or smart lenses)
These take augmented reality one step further. Manufacturers such as Samsung and Sony have announced that they are developing AR lenses: Samsung is working on lenses as an accessory to smartphones, while Sony is designing lenses as standalone AR devices (with features like taking photos and storing data).
- Virtual retinal displays (VRD)
These create images by projecting laser light into the human eye. The aim is bright, high-contrast, high-resolution images, but such systems have yet to be made practical.
Virtual Reality (VR)
What is virtual reality?
Virtual reality (VR) means experiencing things through our computers that don't really exist. From that simple definition, the idea doesn't sound especially new. When you look at an amazing Canaletto painting, for example, you're experiencing the sights and sounds of Italy as it was about 250 years ago—so that's a kind of virtual reality. In the same way, if you listen to ambient instrumental or classical music with your eyes closed, and start dreaming about things, isn't that an example of virtual reality—an experience of a world that doesn't really exist? What about losing yourself in a book or a movie? Surely that's a kind of virtual reality?
If we're going to understand why books, movies, paintings, and pieces of music aren't the same thing as virtual reality, we need to define VR fairly clearly. For the purposes of this simple, introductory article, I'm going to define it as:
A believable, interactive 3D computer-created world that you can explore so you feel you really are there, both mentally and physically. Putting it another way, virtual reality is essentially:
1. Believable
You really need to feel like you're in your virtual world (on Mars, or wherever) and to keep believing that, or the illusion of virtual reality will disappear.
2. Interactive
As you move around, the VR world needs to move with you. You can watch a 3D movie and be transported up to the Moon or down to the seabed—but it's not interactive in any sense.
3. Computer-generated
Why is that important? Because only powerful machines, with realistic 3D computer graphics, are fast enough to make believable, interactive, alternative worlds that change in real-time as we move around them.
4. Explorable
A VR world needs to be big and detailed enough for you to explore. However realistic a painting is, it shows only one scene, from one perspective. A book can describe a vast and complex "virtual world," but you can only really explore it in a linear way, exactly as the author describes it.
5. Immersive
To be both believable and interactive, VR needs to engage both your body and your mind. Paintings by war artists can give us glimpses of conflict, but they can never fully convey the sight, sound, smell, taste, and feel of battle. You can play a flight simulator game on your home PC and be lost in a very realistic, interactive experience for hours (the landscape will constantly change as your plane flies through it), but it's not like using a real flight simulator (where you sit in a hydraulically operated mockup of a real cockpit and feel actual forces as it tips and tilts), and even less like flying a plane.
We can see from this why reading a book, looking at a painting, listening to a classical symphony, or watching a movie don't qualify as virtual reality. All of them offer partial glimpses of another reality, but none are interactive, explorable, or fully believable. If you're sitting in a movie theater looking at a giant picture of Mars on the screen, and you suddenly turn your head too far, you'll see and remember that you're actually on Earth and the illusion will disappear. If you see something interesting on the screen, you can't reach out and touch it or walk towards it; again, the illusion will simply disappear. So these forms of entertainment are essentially passive: however plausible they might be, they don't actively engage you in any way.
VR is quite different. It makes you think you are actually living inside a completely believable virtual world (one in which, to use the technical jargon, you are partly or fully immersed). It is two-way interactive: as you respond to what you see, what you see responds to you: if you turn your head around, what you see or hear in VR changes to match your new perspective.
Types of virtual reality
"Virtual reality" has often been used as a marketing buzzword for compelling, interactive video games or even 3D movies and television programs, none of which really count as VR because they don't immerse you either fully or partially in a virtual world. Search for "virtual reality" in your cellphone app store and you'll find hundreds of hits, even though a tiny cellphone screen could never get anywhere near producing the convincing experience of VR. Nevertheless, things like interactive games and computer simulations would certainly meet parts of our definition up above, so there's clearly more than one approach to building virtual worlds—and more than one flavor of virtual reality. Here are a few of the bigger variations:
1. Fully immersive
For the complete VR experience, we need three things. First, a plausible and richly detailed virtual world to explore; a computer model or simulation, in other words. Second, a powerful computer that can detect what we're doing and adjust our experience accordingly, in real time (so what we see or hear changes as fast as we move—just like in real reality). Third, hardware linked to the computer that fully immerses us in the virtual world as we roam around. Usually, we'd need to put on what's called a head-mounted display (HMD) with two screens and stereo sound, and wear one or more sensory gloves. Alternatively, we could move around inside a room, fitted out with surround-sound loudspeakers, onto which changing images are projected from outside. We'll explore VR equipment in more detail in a moment.
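The second requirement, reacting to the user in real time, boils down to a tight loop: read the tracking sensors, update the viewpoint, redraw the world, and do it all again many times a second. The sketch below shows only that loop shape; read_head_pose() and render_world_from() are hypothetical placeholders for whatever a real headset SDK (OpenXR, for example) would provide, and the 90 frames-per-second target is simply a typical figure for modern headsets.

```python
import time

# Hypothetical device and renderer interfaces; the point here is the loop
# structure, not any particular API.
def read_head_pose():
    """Return (position, orientation) of the headset from its tracking sensors."""
    return (0.0, 1.7, 0.0), (0.0, 0.0, 0.0, 1.0)

def render_world_from(position, orientation):
    """Draw the virtual world as seen from the given viewpoint."""
    pass

TARGET_FRAME_TIME = 1.0 / 90.0   # roughly 90 frames per second keeps the illusion stable

def vr_loop(duration_s=10.0):
    """The core of a fully immersive system: sense, update, draw, repeat fast."""
    end = time.time() + duration_s
    while time.time() < end:
        start = time.time()
        position, orientation = read_head_pose()   # what is the user doing?
        render_world_from(position, orientation)   # adjust the view accordingly
        # Keep the loop at a steady rate so the world changes in real time.
        leftover = TARGET_FRAME_TIME - (time.time() - start)
        if leftover > 0:
            time.sleep(leftover)
```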
2. Non-immersive
A highly realistic flight simulator on a home PC might qualify as nonimmersive virtual reality, especially if it uses a very wide screen, with headphones or surround sound, and a realistic joystick and other controls. Not everyone wants or needs to be fully immersed in an alternative reality. An architect might build a detailed 3D model of a new building to show to clients that can be explored on a desktop computer by moving a mouse. Most people would classify that as a kind of virtual reality, even if it doesn't fully immerse you. In the same way, computer archaeologists often create engaging 3D reconstructions of long-lost settlements that you can move around and explore. They don't take you back hundreds or thousands of years or create the sounds, smells, and tastes of prehistory, but they give a much richer experience than a few pastel drawings or even an animated movie.
3. Collaborative
What about "virtual world" games like Second Life and Minecraft? Do they count as virtual reality? Although they meet the first four of our criteria (believable, interactive, computer-created and explorable), they don't really meet the fifth: they don't fully immerse you. But one thing they do offer that cutting-edge VR typically doesn't is collaboration: the idea of sharing an experience in a virtual world with other people, often in real time or something very close to it. Collaboration and sharing are likely to become increasingly important features of VR in future.
4. Web-based
Virtual reality was one of the hottest, fastest-growing technologies in the late 1980s and early 1990s, but the rapid rise of the World Wide Web largely killed off interest after that. Even though computer scientists developed a way of building virtual worlds on the Web (using a technology analogous to HTML called Virtual Reality Markup Language, VRML), ordinary people were much more interested in the way the Web gave them new ways to access real reality—new ways to find and publish information, shop, and share thoughts, ideas, and experiences with friends through social media. With Facebook's growing interest in the technology, the future of VR seems likely to be both Web-based and collaborative.
5. Augmented reality
Mobile devices like smartphones and tablets have put what used to be supercomputer power in our hands and pockets. If we're wandering round the world, maybe visiting a heritage site like the pyramids or a fascinating foreign city we've never been to before, what we want is typically not virtual reality but an enhanced experience of the exciting reality we can see in front of us. That's spawned the idea of augmented reality (AR), where, for example, you point your smartphone at a landmark or a striking building and interesting information about it pops up automatically. Augmented reality is all about connecting the real world we experience to the vast virtual world of information that we've collectively created on the Web. Neither of these worlds is virtual, but the idea of exploring and navigating the two simultaneously does, nevertheless, have things in common with virtual reality. For example, how can a mobile device figure out its precise location in the world? How do the things you see on the screen of your tablet change as you wander round a city? Technically, these problems are similar to the ones developers of VR systems have to solve—so there are close links between AR and VR.
What equipment do we need for virtual reality?
Close your eyes and think of virtual reality and you probably picture something like this: a geek wearing a wraparound headset (HMD) and datagloves, wired into a powerful workstation or supercomputer. What differentiates VR from an ordinary computer experience (using your PC to write an essay or play games) is the nature of the input and output. Where an ordinary computer uses things like a keyboard, mouse, or (more exotically) speech recognition for input, VR uses sensors that detect how your body is moving. And where a PC displays output on a screen (or a printer), VR uses two screens (one for each eye), stereo or surround-sound speakers, and maybe some form of haptic (touch and body perception) feedback as well. Let's take a quick tour through some of the more common VR input and output devices.
1. Head-mounted displays (HMDs)
There are two big differences between VR and looking at an ordinary computer screen: in VR, you see a 3D image, and it changes smoothly, in real time, as you move your head. That's made possible by wearing a head-mounted display, which looks like a giant motorbike helmet or welding visor, but consists of two small screens (one in front of each eye), a blackout blindfold that blocks out all other light (eliminating distractions from the real world), and stereo headphones. The two screens display slightly different, stereoscopic images, creating a realistic 3D perspective of the virtual world. HMDs usually also have built-in accelerometers or position sensors so they can detect exactly how your head and body are moving (both position and orientation—which way they're tilting or pointing) and adjust the picture accordingly. The trouble with HMDs is that they're quite heavy, so they can be tiring to wear for long periods; some of the really heavy ones are even mounted on stands with counterweights. But HMDs don't have to be so elaborate and sophisticated: at the opposite end of the spectrum, Google has developed a low-cost pair of cardboard goggles with built-in lenses that convert an ordinary smartphone into a crude HMD.
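Those "slightly different" images are produced by rendering the same scene twice, from two camera positions separated by roughly the distance between your eyes. Below is a minimal sketch of that offset calculation; the 63 mm interpupillary distance is just a commonly quoted average, and the helper function and example pose are illustrative rather than taken from any particular headset's SDK.

```python
# A stereoscopic pair is the same scene seen from two viewpoints separated
# horizontally by the interpupillary distance (IPD). Values here are illustrative.

IPD_M = 0.063  # interpupillary distance in metres (assumed average)

def eye_positions(head_position, right_vector):
    """Return left- and right-eye camera positions for a given head pose.

    head_position: (x, y, z) of the point midway between the eyes.
    right_vector: unit vector pointing to the wearer's right.
    """
    half = IPD_M / 2
    left = tuple(h - half * r for h, r in zip(head_position, right_vector))
    right = tuple(h + half * r for h, r in zip(head_position, right_vector))
    return left, right

# Example: head at standing height, facing down -z, so "right" is +x.
left_eye, right_eye = eye_positions((0.0, 1.7, 0.0), (1.0, 0.0, 0.0))
# Rendering the scene once from each position gives the two slightly different
# images that the HMD shows to each eye.
```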
2. Immersive rooms
An alternative to putting on an HMD is to sit or stand inside a room onto whose walls changing images are projected from outside. As you move in the room, the images change accordingly. Flight simulators use this technique, often with images of landscapes, cities, and airport approaches projected onto large screens positioned just outside a mockup of a cockpit. A famous 1990s VR experiment called CAVE (Cave Automatic Virtual Environment), developed at the University of Illinois by Thomas DeFanti, also worked this way. People moved around inside a large cube-shaped room with semi-transparent walls onto which stereo images were back-projected from outside. Although they didn't have to wear HMDs, they did need stereo glasses to experience full 3D perception.
3. Datagloves
See something amazing and your natural instinct is to reach out and touch it—even babies do that. So giving people the ability to handle virtual objects has always been a big part of VR. Usually, this is done using datagloves, which are ordinary gloves with sensors wired to the outside to detect hand and finger motions. One technical method of doing this uses fiber-optic cables stretched along the length of each finger. Each cable has tiny cuts in it so, as you flex your fingers back and forth, more or less light escapes. A photocell at the end of the cable measures how much light reaches it and the computer uses this to figure out exactly what your fingers are doing. Other gloves use strain gauges, piezoelectric sensors, or electromechanical devices (such as potentiometers) to measure finger movements.
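As a toy illustration of the fibre-optic approach, the sketch below maps a raw photocell reading onto an estimated bend angle using a simple linear calibration between a "finger straight" reading and a "finger fully curled" reading. The calibration numbers are invented for the example; a real glove would calibrate each finger individually and likely use a less naive mapping.

```python
# Hypothetical calibration turning a photocell reading into a finger bend angle.
# The fibre leaks more light as the finger bends, so less light reaches the
# photocell; a linear map between two calibrated endpoints is enough for a sketch.

STRAIGHT_READING = 980   # photocell value with the finger held straight (calibrated)
BENT_READING = 310       # photocell value with the finger fully curled (calibrated)
MAX_BEND_DEG = 90.0      # treat a full curl as 90 degrees for this sketch

def bend_angle(photocell_value):
    """Map a raw photocell reading onto an estimated bend angle in degrees."""
    span = STRAIGHT_READING - BENT_READING
    fraction = (STRAIGHT_READING - photocell_value) / span
    fraction = max(0.0, min(1.0, fraction))  # clamp noise outside the calibrated range
    return fraction * MAX_BEND_DEG

print(bend_angle(645))  # roughly a half-bent finger -> about 45 degrees
```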
4. Wands
Even simpler than a dataglove, a wand is a stick you can use to touch, point to, or otherwise interact with a virtual world. It has position or motion sensors (such as accelerometers) built in, along with mouse-like buttons or scroll wheels. Originally, wands were clumsily wired into the main VR computer; increasingly, they're wireless.
Applications of virtual reality
VR has always suffered from the perception that it's little more than a glorified arcade game—literally a "dreamy escape" from reality. In that sense, "virtual reality" can be an unhelpful misnomer; "alternative reality," "artificial reality," or "computer simulation" might be better terms. The key thing to remember about VR is that it really isn't a fad or fantasy waiting in the wings to whistle people off to alternative worlds; it's a hard-edged practical technology that's been routinely used by scientists, doctors, dentists, engineers, architects, archaeologists, and the military for about the last 30 years. What sorts of things can we do with it?
1. Education
Difficult and dangerous jobs are hard to train for. How can you safely practice taking a trip to space, landing a jumbo jet, making a parachute jump, or carrying out brain surgery? All these things are obvious candidates for virtual reality applications. As we've seen already, flight cockpit simulators were among the earliest VR applications; they can trace their history back to mechanical simulators developed by Edwin Link in the 1920s. Just like pilots, surgeons are now routinely trained using VR. In a 2008 study of 735 surgical trainees from 28 different countries, 68 percent said the opportunity to train with VR was "good" or "excellent" for them and only 2 percent rated it useless or unsuitable.
2. Scientific visualization
Anything that happens at the atomic or molecular scale is effectively invisible unless you're prepared to sit with your eyes glued to an electron microscope. But suppose you want to design new materials or drugs and you want to experiment with the molecular equivalent of LEGO. That's another obvious application for virtual reality. Instead of wrestling with numbers, equations, or two-dimensional drawings of molecular structures, you can snap complex molecules together right before your eyes. This kind of work began in the 1960s at the University of North Carolina at Chapel Hill, where Frederick Brooks launched GROPE, a project to develop a VR system for exploring the interactions between protein molecules and drugs.
3. Medicine
Apart from its use in things like surgical training and drug design, virtual reality also makes possible telemedicine (monitoring, examining, or operating on patients remotely). A logical extension of this has a surgeon in one location hooked up to a virtual reality control panel and a robot in another location (maybe an entire continent away) wielding the knife. The best-known example of this is the da Vinci surgical robot, first approved for use in 2000, of which several thousand have now been installed in hospitals worldwide. Introduce collaboration and there's the possibility of a whole group of the world's best surgeons working together on a particularly difficult operation—a kind of WikiSurgery, if you like!
Although it's still early days, VR has already been tested as a treatment for various kinds of psychiatric disorder (such as schizophrenia, agoraphobia, and phantom-limb pain), and in rehabilitation for stroke patients and those suffering degenerative diseases such as multiple sclerosis.
4. Industrial design and architecture
Architects used to build models out of card and paper; now they're much more likely to build virtual reality computer models you can walk through and explore. By the same token, it's generally much cheaper to design cars, airplanes, and other complex, expensive vehicles on a computer screen than to model them in wood, plastic, or other real-world materials. This is an area where virtual reality overlaps with computer modeling: instead of simply making an immersive 3D visual model for people to inspect and explore, you're creating a mathematical model that can be tested for its aerodynamic, safety, or other qualities.
5. Games and entertainment
Games and entertainment have always been one of the most visible faces of VR, from the arcade machines that first gave it its "glorified game" reputation to headsets such as the Oculus Rift, which was designed first and foremost with gamers in mind. Immersive games also push the underlying technology forward: the demand for fast, realistic 3D graphics and responsive head tracking benefits every other application of VR, while broadcasters and film-makers are experimenting with 360-degree video that lets viewers look around a scene rather than simply watch it.
Pros and cons of virtual reality
Like any technology, virtual reality has both good and bad points. How many of us would rather have a complex brain operation carried out by a surgeon trained in VR, compared to someone who has merely read books or watched over the shoulders of their peers? How many of us would rather practice our driving on a car simulator before we set foot on the road? Or sit back and relax in a Jumbo Jet, confident in the knowledge that our pilot practiced landing at this very airport, dozens of times, in a VR simulator before she ever set foot in a real cockpit?
Critics always raise the risk that people may be seduced by alternative realities to the point of neglecting their real-world lives—but that criticism has been leveled at everything from radio and TV to computer games and the Internet. And, at some point, it becomes a philosophical and ethical question: What is real anyway? And who is to say which is the better way to pass your time? Like many technologies, VR takes little or nothing away from the real world: you don't have to use it if you don't want to.
The promise of VR has loomed large over the world of computing for at least the last quarter century—but remains largely unfulfilled. While science, architecture, medicine, and the military all rely on VR technology in different ways, mainstream adoption remains virtually nonexistent; we're not routinely using VR the way we use computers, smartphones, or the Internet. But Facebook's 2014 acquisition of VR company Oculus greatly renewed interest in the area and could change everything. Facebook's basic idea is to let people share things with their friends using the Internet and the Web. What if you could share not simply a photo or a link to a Web article but an entire experience? Instead of sharing photos of your wedding with your Facebook friends, what if you could make it possible for people to attend your wedding remotely, in virtual reality, in perpetuity? What if we could record historical events in such a way that people could experience them again and again, forever more? These are the sorts of social, collaborative uses of virtual reality that (we might guess) Facebook is thinking about exploring right now. If so, the future of virtual reality looks very bright indeed!
Robotic Process Automation (RPA)
What is Robotic Process Automation?
Robotic Process Automation is a software-based technology that uses software robots to emulate how a human executes a business process. In other words, it performs the task on a computer using the same interface a human worker would: it clicks, types, opens applications, uses keyboard shortcuts, and more.
One definition of robotic process automation (RPA) puts it this way: “software robots that mimic and integrate human actions within digital systems to optimize business processes. RPA captures data, runs applications, triggers responses, and communicates with other systems to perform a variety of tasks.”
It is predominantly used to automate business processes and tasks, resulting in reductions in spending and giving businesses a competitive edge.
RPA is versatile and flexible enough to be used in businesses of all sizes, from start-ups to enterprise organizations. Here is a rundown of the two common types available in the market:
1. Programmable bots
A programmable robot is defined by set rules and instructions. Parameters need to be defined by programmers before the bot can get to work. Ultimately, this involves mapping out a process – step-by-step – which can be very time consuming for more complex tasks.
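As a flavour of what "mapping out a process step by step" looks like in practice, here is a minimal sketch of a rule-based bot built with the pyautogui library, which drives the same mouse and keyboard a human worker would. The invoice number, screen coordinates, keyboard shortcuts, and wait times are all assumptions made for the example; a commercial RPA platform would let you record and configure these steps rather than hard-coding them.

```python
import time
import pyautogui  # drives the same mouse and keyboard a human worker would use

# A minimal rule-based bot: every step, including the screen coordinates and the
# invoice number below, is hard-coded up front by the person mapping the process.
INVOICE_NUMBER = "INV-2024-0042"   # illustrative data, not from a real system

def log_invoice():
    pyautogui.hotkey("ctrl", "f")             # open the application's search box
    time.sleep(0.5)                            # wait for the UI, as a person would
    pyautogui.write(INVOICE_NUMBER, interval=0.05)
    pyautogui.press("enter")
    time.sleep(1.0)
    pyautogui.click(640, 412)                  # assumed position of the "Approve" button
    pyautogui.hotkey("ctrl", "s")              # save the record

if __name__ == "__main__":
    log_invoice()
```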
2. Intelligent bots
Bots with artificial intelligence can analyse data – both historical and current – to learn how employees perform a process. The robot follows your clicks, mouse movements, and other actions, and once enough data has been analysed it can complete the process itself. Intelligent, self-learning bots are better suited to processes involving unstructured data and fluctuating parameters.
How does RPA work?
Automation technology has been a staple of business for the last decade, but in recent years, RPA technology has reached an impressive level of sophistication while retaining ease-of-use. It is no longer a tool that is solely used to facilitate the automation of simple and repetitive IT tasks. RPA is maturing, and with the convergence of other technologies – such as artificial intelligence and machine learning (ML) – we are beginning to explore new possibilities.
RPA compared to traditional process transformation approaches