You've probably heard the terms mobile computing, ubiquitous computing and pervasive computing. Are they the same or different? There are a few subtle differences.
Mobile computing is a generic term for the use of computing devices that can be transported during normal usage and that allow people to access data and information from anywhere at any time. It involves the use of mobile communication, mobile hardware and mobile software. It was mobile computing that started the paradigm of 'anytime, anywhere' access to data and services.
Pervasive computing is the growing trend of embedding microprocessors in everyday objects so that they can communicate information. The term implies a technology that exists everywhere: pervasive computing devices are completely connected and constantly available, relying on the convergence of wireless technologies, advanced electronics and the Internet.
Ubiquitous computing is a term coined by Mark Weiser to define a technology that "enhances computer use by making many computers available throughout the physical environment, while making them effectively invisible to the user". Ubiquitous computing can occur using any device, in any location, in any format. In Weiser's idealistic view, computing becomes so intuitive that users need no special skills and are not even aware that they are using a computer.
Often, the terms 'pervasive computing' and 'ubiquitous computing' are used interchangeably. All three paradigms make use of wireless and wired communication infrastructure.
The next step in the evolution of the technology is the Internet of Things (IoT). IoT is a collection of objects (e.g., sensors, computers, mobile devices, RFID tags, etc.) with unique identifiers that are able to detect the presence of nearby objects and exchange data with one another, using wireless and wired networks connected to the Internet, in order to reach a common goal. Sensors and RFID are expected to be the enabling technologies for IoT.
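The idea above can be sketched in a few lines of code: things carry a unique identifier, discover nearby things, and exchange data with them. This is only an illustrative model (the class and method names are invented for this sketch); real IoT deployments exchange data over actual networks using protocols such as MQTT or CoAP.

```python
import uuid

class Thing:
    """A minimal model of an IoT object: a unique identifier plus
    the ability to detect nearby things and exchange data with them.
    Illustrative sketch only, not a real IoT stack."""

    def __init__(self, name):
        self.id = uuid.uuid4()   # globally unique identifier
        self.name = name
        self.neighbours = []     # things currently "in range"
        self.inbox = []          # data received from other things

    def discover(self, other):
        # In a real deployment, discovery might happen via an RFID
        # read or a wireless beacon; here we just record the object.
        self.neighbours.append(other)

    def send(self, data):
        # Broadcast a reading, tagged with our identifier, to every
        # discovered neighbour.
        for thing in self.neighbours:
            thing.inbox.append((self.id, data))

# A temperature sensor detects a gateway and pushes a reading to it.
sensor = Thing("temperature-sensor")
gateway = Thing("gateway")
sensor.discover(gateway)
sensor.send({"temperature_c": 21.5})
print(gateway.inbox[0][1])  # {'temperature_c': 21.5}
```

The unique identifier is what lets the receiving object know *which* thing a reading came from, which is the core of the "interconnected things" model described above.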
The definition of IoT varies depending on the perspective from which you look at it: Internet-oriented, Things-oriented or semantic-oriented. The semantic-oriented approach starts from the idea that the number of objects involved in the future Internet will be very high, making the representation, storage, interconnection, search and organization of the information generated by IoT very challenging.
IoT shifts the Internet from interconnected computers to interconnected things. It expands the "anywhere, anytime" paradigm to "anywhere, anytime, anything".
Related links:
The Computer for the 21st Century
Some Computer Science Issues in Ubiquitous Computing