
Chapter 23

Artificial Eye Vision Using Wireless Sensor Networks

Shaimaa A. El-said, Zagazig University

Aboul Ella Hassanien, Cairo University

Contents

23.1 Introduction
23.2 Wireless Body Area Networks
    23.2.1 Types of WBAN Devices
        23.2.1.1 Wireless Sensor Node
        23.2.1.2 Wireless Actuator Node
        23.2.1.3 Wireless Personal Device
    23.2.2 WBAN Data Rates
    23.2.3 Differences between WSN and WBAN
    23.2.4 WBAN System Prototypes
        23.2.4.1 Physiological Monitoring
        23.2.4.2 Motion and Activity Monitoring
        23.2.4.3 Large-Scale Physiological and Behavioral Studies
23.3 Available Sensors
23.4 Benefits of Using WBAN
23.5 Challenges in a WBAN
    23.5.1 Power
    23.5.2 Computation
    23.5.3 Security and Interference
    23.5.4 Material Constraints
    23.5.5 Mobility
    23.5.6 Robustness
    23.5.7 Continuous Operation
    23.5.8 Regulatory Requirements
    23.5.9 Sensitivity of Sensors
23.6 Artificial Retina
    23.6.1 Retinal Diseases: AMD and RP
    23.6.2 Architecture of the Human Retina
    23.6.3 How the Artificial Retina Works
    23.6.4 Progression of Disease States
    23.6.5 Advancements in Using WBAN in Artificial Retina
23.7 Conclusion
23.8 Future Directions
References
Related Web Sites for More Information


23.1 Introduction

Recent advances in miniaturization and low-cost, low-power design have led to active research in large-scale, highly distributed systems of small, wireless, low-power, unattended sensors and actuators. Wireless sensor networks (WSNs) consist of spatially distributed wireless sensor nodes that cooperate with each other to monitor and collect data on physical or environmental conditions, such as temperature, pressure, motion, and sound. Their applications include tracking the movements of birds, small animals, and insects; monitoring environmental conditions that affect crops and livestock; macro instruments for large-scale Earth monitoring and planetary exploration; chemical or biological detection; precision agriculture; biological, Earth, and environmental monitoring in marine, soil, and atmospheric contexts; forest fire detection; meteorological or geophysical research; flood detection; biocomplexity mapping of the environment; and pollution studies [1, 2]. Health applications of sensor networks include interfaces for the disabled, integrated patient monitoring, diagnostics, drug administration in hospitals, monitoring the movements and internal processes of insects and other small animals, monitoring of human physiological data, and tracking and monitoring of doctors and patients inside a hospital [3]. WSNs can be used effectively in health care to improve both patients' quality of life and the quality of health care services [4]. For example, patients equipped with a wireless body area network (WBAN) need not be physically present with the physician for diagnostics. A body sensor network is particularly valuable in emergencies, since it autonomously sends data about the patient's health so that the physician can prepare for treatment immediately [5–10].

Embedding smart sensors in humans has been a research challenge because of the limitations these sensors impose, from restricted computational capability to limited power. These WSNs carry the promise of drastically improving and expanding the quality of care across a wide variety of settings and for different segments of the population. For example, early system prototypes have demonstrated the potential of WSNs to enable early detection of clinical deterioration through real-time patient monitoring in hospitals [11, 12], enhance first responders' capability of providing emergency care in large disasters through automatic electronic triage [13, 14], improve the quality of life of the elderly through smart environments [15], and enable large-scale field studies of human behavior and chronic diseases [16, 17]. Tang et al. [18] have described their vision of a future in which one device will be able to build a WSN with a large number of nodes, both inside and outside the human body, that may be either predefined or random, according to the application. This vision can only be realized through the use of common communication protocols for WSNs. Standardized hardware and software architectures can support compatible devices, which are expected to significantly affect the next generation of health care systems. Some of these devices can then be incorporated into a WBAN, providing new opportunities for technology to monitor health status [19].

The remainder of this chapter is organized as follows. Section 23.2 provides a brief description of WBANs and reviews the main differences between WSNs and WBANs. Section 23.3 gives a concise summary of some of the commercially available wireless sensor nodes for health monitoring. The benefits of using WBANs are outlined in Section 23.4. Section 23.5 presents issues and challenges that face the use of WSNs in health applications. Section 23.6 surveys WBAN applications in vision rehabilitation; the architecture of the human retina, the operation of artificial retinas, and the progression of disease states are described in detail. The chapter is concluded in Section 23.7, and future directions are presented in Section 23.8.

23.2 Wireless Body Area Networks

Recent advances in electronics have enabled the development of small and intelligent (bio)medical sensors, which can be worn on or implanted in the human body. Using a wireless interface with these sensors simplifies deployment, is more cost effective, and makes health care mobile; this is referred to as mHealth. To fully exploit the benefits of wireless technologies in telemedicine and mHealth, a new type of wireless network has emerged: the wireless on-body network, or WBAN. Human health monitoring [20, 21] is emerging as a prominent application of embedded sensor networks. A number of tiny wireless sensors, strategically placed on or in a patient's body, create a WBAN [22]. A WBAN can monitor vital signs and provide real-time feedback, enabling many diagnostic procedures based on continuous monitoring of chronic conditions or of progress in recovery from an illness. An example of a medical WBAN used for patient monitoring is shown in Figure 23.1. Several sensors are placed in clothes, directly on the body, or under the skin and measure temperature, blood pressure, heart rate (HR), ECG, EEG, respiration rate, SpO2 levels, etc. Alongside the sensing devices, the patient carries actuators, which act as drug-delivery systems. The medicine can be delivered at predetermined moments, triggered by an external source (e.g., a doctor who analyzes the data), or immediately when a sensor detects a problem. One example is monitoring the glucose level in the blood of diabetics: if the sensor detects a sudden rise in glucose, a signal can be sent to the actuator to start the injection of insulin, as sketched below. Consequently, the patient experiences less disruption from the disease. Another example of an actuator is a spinal cord stimulator implanted in the body for long-term pain relief [23].
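
As an illustration of this closed-loop sensor-to-actuator pattern, here is a minimal Python sketch. The thresholds, node functions, and dosing logic are hypothetical placeholders, not clinical values; a real insulin-delivery controller involves safety interlocks far beyond this example.

```python
# Hypothetical closed-loop glucose monitoring sketch (illustrative only).
# Thresholds, units, and dosing logic are assumptions, not clinical values.

GLUCOSE_LOW_MG_DL = 70    # hypoglycemia alert threshold (assumed)
GLUCOSE_HIGH_MG_DL = 180  # hyperglycemia threshold (assumed)

def read_glucose_sensor() -> float:
    """Stand-in for a WBAN glucose sensor reading (mg/dL)."""
    return 195.0  # placeholder value

def trigger_actuator(command: str, dose_units: float) -> None:
    """Stand-in for a wireless message to the drug-delivery actuator."""
    print(f"actuator <- {command}: {dose_units} units")

def control_loop_step() -> None:
    glucose = read_glucose_sensor()
    if glucose < GLUCOSE_LOW_MG_DL:
        # Too low: alert the personal device / physician rather than dose.
        print(f"ALERT: glucose {glucose} mg/dL below threshold")
    elif glucose > GLUCOSE_HIGH_MG_DL:
        # Too high: ask the actuator to deliver a (hypothetical) insulin dose.
        trigger_actuator("inject_insulin", dose_units=2.0)

control_loop_step()
```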

Recent technological advances in wireless networking promise a new generation of WSNs suitable for these on-body/in-body networked systems. Data acquisition across such sensors can be point-to-point or multipoint-to-point, depending on the application. While distributed detection of an athlete's posture [24] needs point-to-point data sharing across various on-body sensors, applications such as vital sign monitoring, shown in Figure 23.2, require all body-mounted and/or implanted sensors [22, 25] to route data multipoint-to-point to a sink node, which, in turn, can relay the information wirelessly to an out-of-body server; a minimal sketch of this pattern follows. Figure 23.3 shows the environment of WBANs.
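
To make the multipoint-to-point pattern concrete, the sketch below models several body sensors forwarding readings to a single sink node that relays them to an out-of-body server. All names are hypothetical, and the in-memory buffer stands in for a real low-power radio link.

```python
# Minimal multipoint-to-point sketch: many sensor nodes -> one sink -> server.
# The "radio" here is just an in-memory buffer; real WBANs use RF links.

from dataclasses import dataclass

@dataclass
class Reading:
    node_id: str
    signal: str
    value: float

class SinkNode:
    """Aggregates readings from body sensors and relays them outward."""
    def __init__(self) -> None:
        self.buffer: list[Reading] = []

    def receive(self, reading: Reading) -> None:
        self.buffer.append(reading)

    def relay_to_server(self) -> list[Reading]:
        batch, self.buffer = self.buffer, []
        return batch  # stand-in for a wireless uplink to the server

sink = SinkNode()
sink.receive(Reading("ecg-1", "heart_rate", 72.0))
sink.receive(Reading("spo2-1", "oxygen_saturation", 97.5))
print(sink.relay_to_server())
```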

WBANs enable dense spatiotemporal sampling of physical, physiological, psychological, cognitive, and behavioral processes in spaces ranging from the personal scale to buildings and even larger-scale ones.


Figure 23.1 Example of patient monitoring in a WBAN. (From E. Krames, Best Pract. Res. Clin. Anaesthesiol., 16, 4, 619–649, December 2002.)

Figure 23.2 Typical WBAN system for detecting and transmitting signals from a human body: (a) current application of health care sensor networks and (b) future application targeted by WBAN. (From A. Darwish and A. E. Hassanien, Sensors, 11, 5561–5595, 2011, doi: 10.3390/s110605561.)


Such dense sampling across spaces of different scales results in sensory information-based health care applications. Current health care applications of WSNs target heart problems [26–28], asthma [7, 29, 30], emergency response [31], and stress monitoring [32]. Wireless biomedical sensor technology can also be used effectively to treat diabetes by providing a more consistent, accurate, and less invasive method for monitoring glucose levels. Currently, to monitor blood glucose, a lancet is used to prick a finger, and a drop of blood is placed on a test strip, which is analyzed either manually or electronically. This constant pricking, several times a day over a period of years, can damage the tissue and blood vessels in the area. A wireless sensor could instead monitor the glucose level [33] and transmit the results to a wristwatch display; implanted in the patient, such a sensor allows the glucose level to be monitored continuously [35]. Wireless biomedical sensors may also play a key role in the early detection of cancer [34]: cancer cells exude nitric oxide, which affects the blood flow in the area surrounding a tumor, so a sensor able to detect these changes in blood flow can be placed in suspect locations, and abnormalities are likely to be detected much sooner than without the sensors. Video and various kinds of embedded sensors can also be used to track and monitor patients in their everyday activities. This information can be processed and relayed to medical personnel; a patient's routine can be assembled over time, and deviations from it can be recognized and analyzed. The following is a brief review of the use of WBANs in medical applications.

Figure 23.3 WBAN and its environment. (From A. Darwish and A. E. Hassanien, Sensors, 11, 5561–5595, 2011, doi: 10.3390/s110605561.)


23.2.1 Types of WBAN Devices

23.2.1.1 Wireless Sensor Node

A device that responds to and gathers data on physical stimuli, processes the data if necessary, and reports the information wirelessly. It consists of several components: sensor hardware, a power unit, a processor, memory, and a transmitter or transceiver [7].

23.2.1.2 Wireless Actuator Node

A device that acts according to data received from the sensors or through interaction with the user. The components of an actuator are similar to the sensor’s: actuator hardware (e.g., hardware for medicine administration, including a reservoir to hold the medicine), a power unit, a processor, memory, and a receiver or transceiver.

23.2.1.3 Wireless Personal Device

A device that gathers all the information acquired by the sensors and actuators and informs the user (e.g., the patient, a nurse, or a GP) via an external gateway, an actuator, or a display/LEDs on the device. The components are a power unit, a (large) processor, memory, and a transceiver. This device is also called a body control unit (BCU) [11], a body gateway, or a sink. In some implementations, a personal digital assistant (PDA) or smart phone is used.

23.2.2 WBAN Data Rates

Because of the strong heterogeneity of the applications, data rates vary widely, ranging from simple data at a few kilobits per second to video streams of several megabits per second. Data can also be sent in bursts, meaning it is sent at a higher rate during the bursts. The data rates for the different applications are given in Table 23.1 and are calculated from the sampling rate, the range, and the desired accuracy of the measurements [38, 39]. Overall, the individual application data rates are not high. However, if a WBAN contains several of these devices (e.g., a dozen motion sensors, ECG, EMG, and glucose monitoring), the aggregate data rate easily reaches a few megabits per second, which is higher than the raw bit rate of most existing low-power radios; the quick calculation below illustrates this.
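
As a sanity check on that claim, the following snippet sums the per-application rates from Table 23.1 for one plausible device mix; the mix itself is an assumption.

```python
# Aggregate data rate for a hypothetical WBAN device mix, using the
# per-application rates from Table 23.1 (values in kbps).
rates_kbps = {
    "ECG (12 leads)": 288,
    "EMG": 320,
    "EEG (12 leads)": 43.2,
    "glucose monitoring": 1.6,
    "motion sensors (12 x 35 kbps)": 12 * 35,
}

total_kbps = sum(rates_kbps.values())
print(f"aggregate rate: {total_kbps:.1f} kbps = {total_kbps / 1000:.2f} Mbps")
# -> roughly 1.07 Mbps, already above the 250 kbps raw rate of an
#    IEEE 802.15.4 radio in the 2.4 GHz band.
```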

23.2.3 Differences between WSN and WBAN

Unlike conventional WSNs, WBANs are smaller, consist of fewer nodes, cover less space, and offer fewer opportunities for redundancy [41]. These devices provide continuous health monitoring and real-time feedback to the user or medical personnel. Furthermore, measurements can be recorded over a longer period of time, improving the quality of the measured data. In WBANs, two types of devices can be distinguished: sensors and actuators. The sensors are used to measure certain parameters of the human body, either externally or internally; examples include measuring the heartbeat or body temperature or recording a prolonged electrocardiogram (ECG). The actuators, on the other hand, take specific actions according to the data they receive from the sensors or through interaction with the user. For example, an actuator equipped with a built-in reservoir and pump administers the correct dose of insulin to a diabetic based on the glucose level measurements. Interaction with the user or other persons is usually handled by a personal device, for example, a PDA or a smart phone, which acts as a sink for the data of the wireless devices. The following illustrates the differences between a WSN and a WBAN [42]:

◾ The devices have limited energy resources available, as they have a very small form factor (often less than 1 cm³). Furthermore, for most devices it is impossible to recharge or change the batteries, although a long device lifetime is desired (up to several years, or even decades, for implanted devices). Hence, the energy resources, and consequently the computational power and available memory, of such devices are limited.

◾ All devices are equally important, and devices are only added when they are needed for an application (i.e., no redundant devices are available).

◾ An extremely low transmit power per node is needed to minimize interference and to cope with health concerns.

◾ The propagation of the waves takes place in or on a (very) lossy medium, the human body. As a result, the waves are attenuated considerably before they reach the receiver.

◾ The devices are located on the human body, which can be in motion. WBANs should therefore be robust against frequent changes in the network topology.

◾ The data mostly consist of medical information. Hence, high reliability and low delay are required.

◾ Stringent security mechanisms are required in order to ensure the strictly private and confidential character of the medical data.

An overview of some of these differences is given in Table 23.2.

Table 23.1 Examples of Medical WBAN Applications [37–40]

Application        | Data Rate    | Bandwidth    | Accuracy
ECG (12 leads)     | 288 kbps     | 100–1000 Hz  | 12 bits
ECG (6 leads)      | 71 kbps      | 100–500 Hz   | 12 bits
EMG                | 320 kbps     | 0–10,000 Hz  | 16 bits
EEG (12 leads)     | 43.2 kbps    | 0–150 Hz     | 12 bits
Blood saturation   | 16 bps       | 0–1 Hz       | 8 bits
Glucose monitoring | 1600 bps     | 0–50 Hz      | 16 bits
Temperature        | 120 bps      | 0–1 Hz       | 8 bits
Motion sensor      | 35 kbps      | 0–500 Hz     | 12 bits
Cochlear implant   | 100 kbps     | –            | –
Artificial retina  | 50–700 kbps  | –            | –
Audio              | 1 Mbps       | –            | –
Voice              | 50–100 kbps  | –            | –


Table 23.2 Schematic Overview of Differences between WSNs and Wireless Body Area Networks

Challenge                | Wireless Sensor Network                                       | Wireless Body Area Network
Scale                    | Monitored environment (meters/kilometers)                    | Human body (centimeters/meters)
Node number              | Many redundant nodes for wide area coverage                  | Fewer nodes, limited in space
Result accuracy          | Through node redundancy                                      | Through node accuracy and robustness
Node tasks               | Node performs a dedicated task                               | Node performs multiple tasks
Node size                | Small is preferred, but not important                        | Small is essential
Network topology         | Very likely to be fixed or static                            | More variable because of body movement
Data rates               | Most often homogeneous                                       | Most often heterogeneous
Node replacement         | Performed easily; nodes even disposable                      | Replacement of implanted nodes difficult
Node lifetime            | Several years/months                                         | Several years/months, with smaller battery capacity
Power supply             | Accessible; likely to be replaced more easily and frequently | Inaccessible; difficult to replace in an implantable setting
Power demand             | Likely to be large; energy supply easier                     | Likely to be lower; energy supply more difficult
Energy-scavenging source | Most likely solar and wind power                             | Most likely motion (vibration) and thermal (body heat)
Biocompatibility         | Not a consideration in most applications                     | A must for implants and some external sensors
Security level           | Lower                                                        | Higher, to protect patient information
Impact of data loss      | Likely to be compensated by redundant nodes                  | More significant; may require additional measures to ensure QoS and real-time data delivery
Wireless technology      | Bluetooth, ZigBee, GPRS, WLAN                                | Low-power technology required

Source: With kind permission from Springer Science+Business Media: Body Sensor Networks, 2006, G.-Z. Yang, Ed.


23.2.4 WBAN System Prototypes

In this subsection, we present several wireless sensing system prototypes developed and deployed to evaluate the efficacy of WBANs in some of the health care applications described above. While wireless health care systems using various wireless technologies exist, this discussion focuses on systems based on low-power wireless platforms for physiological and motion-monitoring studies and on smart phone-based, large-scale studies.

23.2.4.1 Physiological Monitoring

In physiological-monitoring applications [70], low-power sensors measure and report a person's vital signs (e.g., pulse oximetry, respiration rate, and temperature). These applications can be developed and deployed in different contexts, ranging from disaster response to in-hospital patient monitoring and long-term remote monitoring of the elderly. Systems that automate patient monitoring have the potential to increase the quality of care, both at disaster scenes and in clinical environments. Systems such as CodeBlue [44], MEDiSN [45], and Washington University's vital sign monitoring system [46] target these application scenarios. Specifically, CodeBlue [44] aims to improve the triage process during disaster events with the help of WSNs comprising motes with IEEE 802.15.4 radios. The CodeBlue project integrated various medical sensors [e.g., EKG, SpO2, pulse rate, electromyography (EMG)] with mote-class devices and proposed a publish/subscribe-based network architecture that also supports priorities and remote sensor control [47]. Finally, victims with CodeBlue monitors can be tracked and localized using RF-based localization techniques [48]. Figure 23.4 shows a wireless pulse oximeter sensor developed for the CodeBlue system. The project covers prehospital care, in-hospital emergency care, disaster response, and stroke patient rehabilitation, and research from it has potential applications in resuscitative care, real-time triage decisions, and long-term patient observation [49]. The system integrates low-power wireless wearable vital sign sensors, handheld computers, and location-tracking tags.

Ko et al. proposed MEDiSN (see Figure 23.5) to address goals similar to CodeBlue's (e.g., improving the monitoring of hospital patients, disaster victims, and first responders) but with a different network architecture [45]. Specifically, unlike the ad hoc network used in CodeBlue, MEDiSN employs a wireless backbone network of easily deployable relay points (RPs). RPs are positioned at fixed locations and self-organize into a forest rooted at one or more gateways (i.e., PC-class devices that connect to the Internet) using a variant of the collection tree protocol (CTP) [50] tailored to high data rates; a simplified sketch of such tree formation follows. Motes that collect vital signs, known as miTags, associate with RPs to send their measurements to the gateway.
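
The following is a minimal, idealized sketch of collection-tree formation, assuming a known link graph and using hop count as the routing metric. Real CTP uses dynamic link-quality estimates (ETX) and continuous route maintenance, none of which is modeled here; the node names are invented.

```python
# Simplified collection-tree formation over a known link graph.
# Real CTP adapts to link quality at runtime; this static BFS-by-hop-count
# version only illustrates the idea of routing all data toward a gateway.
from collections import deque

def build_collection_tree(links: dict[str, list[str]], gateway: str) -> dict[str, str]:
    """Return {node: parent} with every reachable node routed toward gateway."""
    parent: dict[str, str] = {}
    visited = {gateway}
    queue = deque([gateway])
    while queue:
        node = queue.popleft()
        for neighbor in links.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                parent[neighbor] = node  # forward data toward the gateway
                queue.append(neighbor)
    return parent

links = {
    "gw": ["rp1", "rp2"],
    "rp1": ["gw", "rp3", "miTag-a"],
    "rp2": ["gw", "miTag-b"],
    "rp3": ["rp1", "miTag-c"],
}
print(build_collection_tree(links, "gw"))
# {'rp1': 'gw', 'rp2': 'gw', 'rp3': 'rp1', 'miTag-a': 'rp1',
#  'miTag-b': 'rp2', 'miTag-c': 'rp3'}
```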

Figure 23.6 shows the UbiMon system (Ubiquitous Monitoring Environment for Wearable and Implantable Sensors), which aims to provide continuous and unobtrusive monitoring of patients in order to capture transient events [51]. As shown in Figure 23.6a through c, a compact-flash WBAN card was developed for PDAs, with which sensor signals can be gathered, displayed, and analyzed on the PDA (Figure 23.6d).

In [52], researchers use a WBAN to develop the eWatch, shown in Figure 23.7: a wearable sensing, notification, and computing platform built into a wristwatch. eWatch can be used for applications such as context-aware notification, elderly monitoring, and fall detection. The eWatch system can sense whether the user is in trouble and then query the user to confirm that it is an emergency; if the user does not respond, the eWatch can use its networking abilities to call for help. The eWatch can also remind a patient when to take certain medication.


23.2.4.2 Motion and Activity Monitoring

Another application domain for WSNs in health care is high-resolution monitoring of movement and activity levels. Wearable sensors can measure limb movements, posture, and muscular activity and can be applied to a range of clinical settings, including gait analysis [53–55], activity classification [56, 57], athletic performance [58, 59], and neuromotor disease rehabilitation [60, 61]. In a typical scenario, a patient wears up to eight sensors (one on each limb segment) equipped with MEMS accelerometers and gyroscopes. A base station, such as a PC-class device in the patient's home, collects data from the network. Data analysis can then recover the patient's motor coordination and activity level, which is, in turn, used to measure the effect of treatments. In contrast to physiological monitoring, motion analysis involves multiple sensors on a single patient, each measuring high-resolution signals, typically six channels per sensor sampled at 100 Hz each (a rough estimate of the resulting data volume is sketched below). This volume of sensor data precludes real-time transmission, especially over multi-hop paths, because of both bandwidth and energy constraints. In such studies, the size and weight of the wearable sensors must be minimized to avoid encumbering the patient's movement. The SHIMMER sensor platform shown in Figure 23.8 measures 44.5 × 20 × 13 mm and weighs just 10 g, making it well suited for long-term wearable use.
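
To see why real-time streaming is problematic, the following back-of-the-envelope calculation estimates the raw data rate for the scenario above; the 12-bit sample resolution is an assumption borrowed from the motion-sensor row of Table 23.1.

```python
# Rough data-volume estimate for full-body motion capture (assumed parameters).
sensors = 8            # one per limb segment
channels = 6           # 3-axis accelerometer + 3-axis gyroscope
sample_rate_hz = 100   # per channel
bits_per_sample = 12   # assumed, per Table 23.1's motion-sensor row

raw_bps = sensors * channels * sample_rate_hz * bits_per_sample
print(f"raw rate: {raw_bps / 1000:.1f} kbps")           # 57.6 kbps
print(f"per day : {raw_bps * 86400 / 8 / 1e6:.0f} MB")  # ~622 MB/day

# 57.6 kbps is under an 802.15.4 radio's 250 kbps raw rate, but multi-hop
# forwarding, protocol overhead, and radio duty cycling for energy savings
# leave far less usable bandwidth in practice.
```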

Figure 23.4 Wireless pulse oximeter sensor (a Mica2 mote with antenna and finger sensor).

Figure 23.5 Medical information tag, or miTag for short, used in MEDiSN. (From J. Ko et al., ACM Trans. Embedded Comput. Syst., 10, 1, article 11, 11:1–11:29, 2010.)


Figure 23.6 UbiMon components: (a) wireless 3-lead ECG sensor, (b) ECG strap, (c) SpO2 sensor, and (d) PDA base station.

Figure 23.7 eWatch computing platform.

In [62], SATIRE is designed to identify a user's activity from accelerometers and global positioning system (GPS) sensors integrated into smart attire, such as a winter jacket. SATIRE nodes, which are based on the MICAz platform [63], measure accelerometer data and log it to local flash. These data are transmitted opportunistically over the low-power radio whenever the node is within communication range of the base station, and once collected at the base station, they are processed offline to characterize the user's activity patterns, such as walking, sitting, or typing. Sensor nodes perform aggressive duty cycling to reduce power consumption, extending lifetimes from several days to several weeks. The Mercury system [60], in turn, permits long-term studies of a patient's motor activity for neuromotor disease research, including Parkinson's disease, stroke, and epilepsy. Energy is far more constrained in Mercury than in SATIRE because of the use of lightweight sensor nodes with small batteries. Mercury builds upon SATIRE's approach to energy management and integrates several energy-aware adaptations, including dynamic sensor duty cycling, priority-driven data transmissions, and on-board feature extraction. Mercury is being used in several studies of Parkinson's and epilepsy patients.

Figure 23.9 shows the pants for hip patient rehabilitation using a WBAN. The HipGuard system was developed for patients recovering from hip surgery. It monitors the patient's leg and hip position and rotation with embedded wireless sensors. Alarm signals can be sent to the patient's wrist unit if hip or leg positions or rotations are incorrect, so the HipGuard system can provide useful real-time information for the patient's rehabilitation process [64].

Researchers use WBAN and wireless telephony technology in the MobiHealth project, which uses GPRS/UMTS wireless communication to transfer data and create a generic platform for home health care. MobiHealth aims to provide continuous monitoring to patients outside the hospital environment [65]. It targets improving patients' quality of life by enabling new value-added services in disease prevention, disease diagnosis, remote assistance, physical state monitoring, and even clinical research [66]. Figure 23.10 shows a MobiHealth system.

Figure 23.11 shows the use of a WBAN in the LifeShirt project. The LifeShirt is a comfortable and completely noninvasive "smart garment" that gathers data during a patient's daily routine, providing the most complete remote picture of the patient's health status. It enables health care professionals and researchers to accurately monitor more than 30 vital life-sign functions in the real-world settings where patients live and work [42].

WBANs can also be applied in a Vital Sign Monitoring System, as shown in Figure 23.12. In [67], a real-time patient monitoring system was designed and developed to integrate vital sign sensors, location sensors, ad hoc networking, electronic patient records, and web portal technology, allowing remote monitoring of patient status.

Figure 23.8 SHIMMER wearable sensor platform.


Figure 23.9 HipGuard pants for hip patient rehabilitation (an ANT sensor network with footbed sensors and a Suunto T6 wrist unit, linked via Bluetooth and GSM to the Internet).

Figure 23.10 MobiHealth system, monitoring a patient (ECG, EEG, blood pressure/pulse, blood glucose) outside the hospital environment.


Figure 23.11 Smart LifeShirt, with inductive plethysmograph sensor bands for chest and abdominal respiratory movement, ECG leads, a posture sensor, and an on-garment recorder that logs all data.

Figure 23.12 Vital Sign Monitoring System, bridging an 802.15.4 body network over EVDO to the Internet. (From T. Gao et al., Vital signs monitoring and patient tracking over a wireless network, in Proceedings of the 27th Annual International Conference of the IEEE EMBS, Shanghai, China, pp. 102–105, 1–4 September 2005.)


23.2.4.3 Large-Scale Physiological and Behavioral Studies

The third category of WBAN applications in health care is their use in conducting large-scale physiological and behavioral studies [70]. The confluence of body-area networks of miniature wireless sensors, always-connected sensor-equipped smart phones, and cloud-based data storage and processing services is leading to a new paradigm in population-scale medical research, particularly for ailments whose causes and manifestations relate to human behavior and living environments. The combination of body-area WSNs, smart phones, and cloud services permits physical, physiological, behavioral, social, and environmental data to be collected from human subjects in their natural environments continually, in real time, unattended, in an unobtrusive fashion, and over long periods. Typically, data are collected from wireless sensors worn by subjects, from wireless medical instruments, and from sensors embedded in devices such as smart phones.

The AutoSense system [68] collects objective measurements of personal exposure to psychosocial stress and alcohol in study participants' natural environments. A field-deployable suite of wireless sensors forms a body-area wireless network and measures HR, HR variability, respiration rate, skin conductance, skin temperature, arterial blood pressure, and blood alcohol concentration. From these sensor readings, which, after initial validation and cleansing at the sensor, are sent to a smart phone, features of interest indicating the onset of psychosocial stress and the occurrence of alcohol consumption are computed in real time (one such feature is sketched below). The collected information is then disseminated to researchers answering behavioral research questions about stress, addiction, and the relationship between the two. Moreover, by also capturing time-synchronized information about a subject's physical activity, social context, and location, factors that lead to stress can be inferred, and this information can potentially be used to provide personalized guidance about stress reduction. This system for population-scale medical studies is still in its early stages, and several technical and algorithmic challenges remain to be addressed. Energy is certainly one: some on-body sensors have high sampling rates, leading to significant energy consumption (i.e., low battery life), while the desire to facilitate compliance with study protocols precludes a frequent charging schedule. An even bigger challenge for this technology is information privacy and its tension with the quality and value of information [69].
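
As one concrete example of the kind of feature such systems compute, the snippet below derives RMSSD, a standard heart rate variability measure, from a window of inter-beat (RR) intervals. Treating RMSSD alone as a stress indicator is a simplification; systems like the one described combine many features with validated models.

```python
# RMSSD (root mean square of successive differences) over RR intervals,
# a standard heart-rate-variability feature. Lower HRV often accompanies
# stress, but real stress inference combines many validated features.
import math

def rmssd(rr_intervals_ms: list[float]) -> float:
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# A hypothetical 8-beat window of RR intervals (milliseconds).
window = [812, 790, 805, 798, 820, 801, 795, 810]
print(f"RMSSD: {rmssd(window):.1f} ms")
```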

23.3 Available Sensors

The research community has been very active in developing a plethora of wearable and wireless systems that seamlessly monitor HR, oxygen level, blood flow, respiratory rate, muscle activity, movement patterns, body inclination, and oxygen uptake (VO2). Given below is a concise summary of some of the commercially available wireless sensor nodes for health monitoring:

◾ Pulse oxygen saturation sensors, which measure HR and the percentage of hemoglobin (Hb) saturated with oxygen (SpO2)

◾ Blood pressure sensors
◾ ECG sensors
◾ Electromyogram (EMG) sensors for measuring muscle activity
◾ Temperature sensors, both for core body temperature and for skin temperature
◾ Respiration sensors
◾ Blood flow sensors
◾ Blood oxygen level sensors (oximeters) for measuring cardiovascular exertion (distress)


23.4 Benefits of Using WBAN

Whenever a WBAN is introduced for public use, the first question is: how will it help me, and what can it do for me? In this section, we discuss who benefits from WBAN applications in the health care field.

The news reports daily on shortages of specialized doctors and nurses around the country. One benefit of these applications is that they increase doctors' and nurses' efficiency, letting them care for more patients than before. Their work is also improved and made easier by access to accurate, real-time data on patients. For patients whose health is monitored online, the benefits are even greater. They have increased access to specialized doctors, and they no longer have to stay at the hospital: this mobility lets them go about their own work while still under a doctor's care. Safety also improves, because the rate of mistakes can be decreased significantly. Patients can also be reluctant to change their daily lives when signing up for a treatment; with wireless technologies, their health care can be less intrusive, as in the case of wearable sensors. They do not have to come to the hospital for a blood pressure check: it can be done while they work, through wearable wireless sensors that transmit the information in real time to their doctors.

As mentioned earlier, negligent mistakes by doctors and nurses cost hospitals, insurance companies, and the government a large sum of money each year; these costs can be lowered by reducing the number of mistakes. Efficient and secure data handling and resource management are other areas where wireless networks can help. Deploying huge machines around a hospital can be very expensive and time consuming; with wireless technology, interfaces can be designed so that the machines are accessible from anywhere in the hospital, allowing rapid and flexible deployment. By increasing doctors' and nurses' efficiency, hospitals can provide care for more patients and increase their profits.

23.5 Challenges in a WBAN

The human body is a complicated internal environment that responds to and interacts with its external surroundings but is, in a way, separate and self-contained. The human body environment not only has a smaller scale, but also requires a different type and frequency of monitoring, with different challenges than those faced by WSNs. The monitoring of medical data results in an increased demand for reliability. For ease of use, sensors placed on the body must have a small form factor that includes the battery and antenna, resulting in a higher need for energy efficiency. In this section, some of the core challenges in designing WSNs for health care applications are described. While not exhaustive, the list spans a wide range of topics, from core computer system themes, such as scalability, reliability, and efficiency, to large-scale data mining and data association problems and even legal issues. The challenge of developing a tiny device that can perform complicated tasks and be tough enough to survive in the eye's salty environment demands a diversity of talents and capabilities. Generally, health monitoring is performed on a periodic-check basis, in which the patient must remember his or her symptoms; the doctor performs some tests and makes a diagnosis, then monitors the patient's progress over the course of treatment [71]. Health care applications of WSNs allow in-home assistance, smart nursing homes, and clinical trial and research augmentation [72]. Before describing the use of WBANs in vision repair, the following subsections focus on several challenges and general aspects of this kind of technology. Challenges in health care applications include low power, limited computation, security and interference, material constraints, mobility, robustness, continuous operation, regulatory requirements, and sensor sensitivity [73].


23.5.1 Power

As most wireless network-based devices are battery operated, the power challenge is present in almost every WSN application, but a smart sensor implanted in a person poses an even greater challenge. In fully active mode, a node cannot operate for more than a month, because a typical alkaline battery provides about 50 Wh of energy (a rough lifetime estimate is sketched below). In practice, many applications must guarantee that the device will work for a year or two without any replacement; this includes devices such as heart pacemakers. To deal with these power issues, developers have to design better scheduling algorithms and power management schemes [74]. Although much effort has gone into designing low-power sensors [75], there is still a need for robust energy-harvesting techniques, related to the new term "green technology." Solar cells, which can provide up to 15 mW/cm² under direct sunlight, cannot be used with body-worn wireless sensors because the sensors are preferably placed under clothing. Thus, energy-harvesting techniques based on motion [76] and body heat [77, 78] should be explored for health care systems.
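
As a rough illustration of the arithmetic behind that one-month figure, the sketch below estimates node lifetime from battery capacity and average power draw; the draw values are assumptions for illustration only.

```python
# Battery-lifetime estimate for a sensor node (assumed power draws).
BATTERY_WH = 50.0  # alkaline battery capacity cited in the text

def lifetime_days(avg_power_mw: float, battery_wh: float = BATTERY_WH) -> float:
    """Days of operation at a constant average power draw."""
    hours = battery_wh / (avg_power_mw / 1000.0)
    return hours / 24.0

# Fully active radio + MCU, assumed ~60 mW: about a month, as the text says.
print(f"active: {lifetime_days(60):.0f} days")        # ~35 days
# Aggressive duty cycling to an assumed 0.1 mW average: decades.
print(f"duty-cycled: {lifetime_days(0.1) / 365:.0f} years")
```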

23.5.2 Computation

Because both power and memory are limited, computation is also limited. Biosensors cannot perform computations over large amounts of data because they lack sufficient memory, and, unlike conventional WSN nodes, they have little computational power. Because communication is vital and memory is low, little power remains for computation. One solution is for sensors with varying capabilities to communicate with each other and send out a single collaborative data message [73].

23.5.3 Security and Interference

Security and interference are critical considerations, especially for medical systems. Physiological data collected by the sensor network is health information, which is personal in nature. It is critical, and in the interest of the individual, to keep this information from being accessed by unauthorized entities. This property, confidentiality, can be achieved by encrypting the data with a key during transmission; a minimal sketch appears at the end of this subsection. Data authenticity is another security requirement. It is very important for a biosensor network because, in its absence, an illegal entity could masquerade as a legal one and report false data to the control node or give wrong instructions to the other biosensors, possibly causing significant harm to the host [79]. In [80], elliptic curve cryptography for key distribution is proposed to decrease energy consumption. However, more efficient cryptographic methods are still needed to meet the security requirements. Another challenging issue arises when patients are unconscious and obtaining passwords may not be possible; in this case, biometric methods can be used for accountability, as in [81], as can the physiological signal-based authentication method proposed in [82]. Security attacks are especially problematic on low-power WSN platforms for several reasons, including the strict resource constraints of the devices, minimal accessibility to the sensors and actuators, and the unreliable nature of low-power wireless communications. The security problem is further exacerbated by the observation that transient and permanent random failures are common in WBANs, and such failures are vulnerabilities that can be exploited by attackers. For example, with these vulnerabilities, it is possible for an attacker to falsify context, modify access rights, create denial of service, and, in general, disrupt the operation of the system. This could result in a patient being denied treatment or, worse, receiving the wrong treatment.
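
To make the confidentiality and authenticity requirements concrete, here is a minimal sketch using an authenticated cipher (AES-GCM from the Python cryptography package). Key distribution, which the text notes is the hard part, is simply assumed here, and the payload format is invented.

```python
# Confidentiality + authenticity for a sensor reading with AES-GCM.
# Assumes a shared key is already distributed (e.g., via the ECC-based
# scheme cited in the text); key management is out of scope here.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)
aesgcm = AESGCM(key)

reading = b'{"node": "ecg-1", "hr_bpm": 72}'   # hypothetical payload
nonce = os.urandom(12)                          # must never repeat per key
associated = b"patient-42"                      # authenticated, not encrypted

ciphertext = aesgcm.encrypt(nonce, reading, associated)

# The control node decrypts; tampering with ciphertext, nonce, or the
# associated data raises InvalidTag instead of returning forged data.
plaintext = aesgcm.decrypt(nonce, ciphertext, associated)
assert plaintext == reading
```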


23.5.4 Material Constraints

Another issue for WSN applications in health care is material constraints. A biosensor is implanted within the human body; therefore, its shape, size, and materials must be harmless to body tissue. For example, a smart sensor designed to support a retinal prosthesis must be small enough to fit within an eye. Chemical reactions with body tissue and the eventual disposal of the sensor are also extremely important.

23.5.5 Mobility

The purpose of WBANs and health-monitoring systems is to let people go about their lives while accessing high-quality medical services. The use of sensor networks for this purpose is not new [83, 84]. However, the emergence of WBANs has enabled the development of applications that ensure and promote user mobility; WBAN technologies thus enable ubiquitous health care systems. Mobility requires the development of multi-hop, multi-modal, ad hoc sensor networks, which bring the problems mentioned in the previous subsections along with the problem of location awareness.

23.5.6 Robustness

Whenever sensor devices are deployed in harsh or hostile environments, the rate of device failure becomes high. Protocol designs must therefore have built-in mechanisms so that the failure of one node does not cause the entire network to cease operation. A possible solution is a distributed network in which each sensor node operates autonomously while still cooperating when necessary. For instance, if the sensing part of a node is not working, its communication part should still be used if it benefits the network and is operating as expected. One way to achieve this is to compose a node of a sensing block, a communication block, a scheduling block, and a data block, which isolates a malfunctioning block from the rest of the components in the node and also reduces power consumption among the various components. To ensure that the proper data is sent and received, a few alternatives can be used, such as checksums, parity checks, and cyclic redundancy checks [73]; a small example follows.
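
As an illustration of that last point, the snippet below appends a CRC-32 checksum to an outgoing packet and verifies it on receipt, using Python's standard zlib module; the 4-byte trailer layout is an arbitrary assumption for this sketch.

```python
# Detecting corrupted sensor packets with a CRC-32 trailer (zlib is in
# the Python standard library). The big-endian 4-byte trailer format is
# an arbitrary choice for this sketch.
import zlib

def frame(payload: bytes) -> bytes:
    crc = zlib.crc32(payload)
    return payload + crc.to_bytes(4, "big")

def unframe(packet: bytes) -> bytes | None:
    payload, trailer = packet[:-4], packet[-4:]
    if zlib.crc32(payload) != int.from_bytes(trailer, "big"):
        return None  # corrupted in transit: drop or request retransmission
    return payload

pkt = frame(b"hr=72;spo2=97")
assert unframe(pkt) == b"hr=72;spo2=97"
corrupted = b"hr=99" + pkt[5:]  # simulate bit errors in the payload
assert unframe(corrupted) is None
```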

23.5.7 Continuous Operation

Continuous operation must be ensured throughout the lifecycle of a biosensor, as it is expected to operate for days, sometimes weeks, without operator intervention. Hence, it is important to keep the amount of communication to a minimum: communications that occur for purposes other than the actual data transfer should be minimized if they cannot be eliminated [79].

23.5.8 Regulatory Requirements

Regulatory requirements must always be met; there must be evidence that these devices will not harm the human body. The wireless transmission of data must not harm the surrounding tissues, and the devices' long-term operation and power use must also be nonmalignant. Design for safety must be a fundamental feature of biomedical sensor development, even at the earliest stages [73]. It is conceivable that unethical researchers could perform tests and trials with devices that are dangerous to the volunteers; diligent oversight of testing operations is therefore imperative.


23.5.9 Sensitivity of Sensors

The sensitivity of sensor devices is especially important when users wear them in harsh environments, such as fire situations. Sweat can negatively affect the transducers of the sensor devices, reducing the sensitivity of body-worn sensors or requiring their recalibration. Gietzelt et al. [85] proposed an automatic self-calibration algorithm for triaxial accelerometers, but self-calibration and sensitivity-enhancement algorithms are still needed for sensor devices other than accelerometers. Low-maintenance, highly sensitive vital-sign monitoring sensors will grow in importance as pervasive health care systems evolve.

23.6 Artificial Retina

Pervasive computing will change the computing landscape, enabling the implementation of applications that were never before imagined. As the population of the United States ages, vision loss is becoming an increasingly serious public health problem. Approximately 3.3 million Americans over the age of 40, or one person in 28, are already either blind or have low vision [86] (vision so poor that it significantly interferes with everyday life and cannot be corrected, even with eyeglasses). The main goal of applying WBANs in vision rehabilitation is to build a chronically implanted artificial retina for visually impaired people, addressing two retinal diseases: age-related macular degeneration (AMD) and retinitis pigmentosa (RP).

23.6.1 Retinal Diseases: AMD and RP

Two major causes of blindness, AMD and RP, damage the photoreceptor cells in the light-sensitive membrane of the eye (the retina) but leave the nerve connections to the brain intact; see Figure 23.13. Patients eventually lose their vision. A healthy photoreceptor stimulates the brain through electric impulses when light arrives from the external world; when photoreceptors are damaged, vision is blocked at their locations.

AMD is an eye disease primarily affecting the central vision regions in people age 60 and older [87]. According to the Macular Degeneration Research Fund, a case of AMD is diagnosed in the United States every 3 min. Each year, 1.2 million of the estimated 12 million people with AMD will suffer severe vision loss. Patients with AMD have dark areas in their vision caused by fluid leakage or bleeding in the macula, the center of the retina that produces the sharpest vision. The brain initially compensates for these dark patches. Early cellular dysfunction or spotting in the macula may go undetected until the disease is in advanced stages.

RP is the most common inherited cause of blindness in people between the ages of 20 and 60 worldwide. Approximately 500,000 people in the United States suffer some level of visual impairment from RP, and, of those, 20,000 are totally blind. RP is a degenerative disease of the retina that affects the photoreceptors, or rod cells, which control a person's ability to see in dimly lit surroundings. Vision loss is gradual and may result in diminished or lost peripheral vision or in blindness.

A person with normal vision has frontal and peripheral vision, as shown in Figure 23.14. The eye sees an image by interpreting the light that hits the light-sensing cells at the back of the eye, the rods and cones that line the retina. If something happens to these cells, vision distortion or even blindness occurs [88].

A person with RP loses much of his or her peripheral vision and sees an effect sometimes called "tunnel vision." A person with AMD usually loses vision in the center of the field of vision, as shown in Figure 23.15.


Figure 23.13 Human eye viewed from the side. (Labeled structures: cornea, iris, pupil, crystalline lens, vitreous, sclera, retina, macula (vision center), and optic nerve (to brain).)

Figure 23.14 Normal vision. (Labeled structures: conjunctiva, cornea, iris, pupil, lens, sclera, retina, and optic nerve.)


Figure 23.15 How RP and AMD affect vision.

23.6.2 Architecture of the Human Retina

For the layman, it is necessary to understand the unique architecture of the human retina. The retina has five layers (as determined by examining slices through a light microscope). Curiously, the light coming into the eye must pass through four of the layers before it reaches the section of the retina where the photocells receive and react to the light. When the photocells fire, they send signals back toward the surface of the retina through the four layers above. Various levels of image processing take place as the signals leave the photocells and traverse the other layers. The ganglion cells are the final layer of retinal processing. Their axons (nerve fibers leaving the cell body) are the roughly one million nerves that leave the retina and travel back into the brain. There is debate among scientists about the best approach to creating artificial vision. The first, epiretinal, team believes that resting the chip on the ganglion cells (and their axons) is easier to do surgically and avoids the damaged (diseased) tissue below. The second, subretinal, group feels that it is best to surgically "cut" beneath the four layers and place the chip directly in the (damaged) photoreceptor layer. The eye is only about one inch in diameter. Retinal implants have to be very small and must be surgically placed precisely into a thin strip of tissue (the retina is sometimes compared to a wet slice of Kleenex). To reach the retina, surgeons have to remove much of the internal part of the eye: the lens may have to be removed and replaced, and the vitreous is removed and later replaced with an injected saline solution. The skill of the surgeon is very important in these delicate operations.

23.6.3 How Artificial Retina Works

Normal vision begins when light enters and moves through the eye to strike specialized photoreceptor (light-receiving) cells in the retina called rods and cones, as shown in Figure 23.16. These cells convert light signals to electric impulses that are sent to the optic nerve and the brain. Retinal diseases such as AMD and RP destroy the photoreceptor cells; the artificial retina device bypasses these cells to transmit signals directly to the optic nerve. The device consists of a tiny camera and microprocessor mounted in eyeglasses, a receiver implanted behind the ear, and an electrode-studded array that is tacked to the retina. A wireless battery pack worn on the belt powers the entire device. The camera captures an image and sends the information to the microprocessor, which converts the data to an electronic signal and transmits it to the receiver. The receiver sends the signals through a tiny cable to the electrode array, stimulating it to emit pulses. The pulses travel through the optic nerve to the brain, which perceives patterns of light and dark spots corresponding to the electrodes stimulated. Patients learn to interpret the visual patterns produced.

Figure 23.16 Implantable artificial retina for blind people. (Labeled components: camera, microprocessor, receiver, cable, retinal implant tacked to the retina, ganglion cells, photoreceptors (rods and cones), and the area of photoreceptors destroyed by disease. Adapted, with permission, from IEEE Engineering in Medicine and Biology 24:15, 2005.)
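The data path just described (camera, microprocessor, receiver, electrode array) can be summarized in a short sketch. This is only a conceptual model under simplifying assumptions: real devices must handle biphasic pulse timing, charge-safety limits, and wireless framing, and every name and parameter below is illustrative rather than taken from an actual device.

```python
import numpy as np

def frame_to_pulses(frame, grid=(4, 4), levels=4):
    """Map a grayscale camera frame to stimulation amplitudes for an
    electrode grid, mimicking the camera -> microprocessor -> receiver ->
    electrode-array chain described above (an illustrative sketch).

    Each electrode receives the mean brightness of its image region,
    quantized to a few discrete stimulation levels (0 = no pulse).
    """
    frame = np.asarray(frame, dtype=float)
    gh, gw = grid
    h, w = frame.shape
    # Average brightness over the image block feeding each electrode.
    blocks = frame[: h - h % gh, : w - w % gw].reshape(gh, h // gh, gw, w // gw)
    means = blocks.mean(axis=(1, 3))
    scale = means.max() or 1.0        # avoid division by zero on a dark frame
    return np.round(means / scale * (levels - 1)).astype(int)

# A bright diagonal bar on a dark background, as a 64x64 "camera" frame.
frame = np.zeros((64, 64))
np.fill_diagonal(frame, 255)
print(frame_to_pulses(frame))   # strongest pulses along the diagonal
```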

23.6.4 Progression of Disease States

The design and success of an implant depends on the situation in the eye or cortex at the time of surgery [87]. Most retinal implants are currently targeted at RP patients. There are many kinds of RP, depending on the location and extent of genetic damage; some appear earlier in life, some later. Some forms are severe and acute; others are more chronic and play out over a longer time frame.

The success of a retinal implant depends on how much retina is left (that is healthy enough to receive a chip). There are stages of disease progression in RP. As the disease progresses, the five-layer retinal architecture remodels and the neurons rewire. Blood vessels also reposition and relocate as the RP unfolds, and the astrocytes and (especially) the Müller cells proliferate as the disease advances. No two eyes are ever alike; the success or failure of an implant depends on individual circumstances.

A chemically powerful neurotransmitter called glutamate is associated with retinal photocells. Glutamate works fine as long as it is contained inside its normal cycle of operation, but when disease processes begin in the retina, glutamate is released into tissues that it was not designed to come in contact with. In effect, glutamate is "dynamite": it becomes a cellular toxin, poisonous to everything around it, and it is somehow involved in the breakdown of the photocell layer in RP. What makes RP patients eligible for retinal implants is that, although the rods and cones deteriorate (from glutamate poisoning), 50% to 75% of the ganglion cells survive, and approximately 80% of the bipolar and amacrine cells survive. In other words, RP severely targets one layer of the retina (only about 5% of the photoreceptors survive) but more or less spares the other (processing and transmission) layers; that is, only the sensors are (initially) damaged severely in RP. Other complicating factors with RP include a marked decrease in blood flow (circulation drops 80%), a breakdown of the layers (so that it is difficult to distinguish them), and a progression, in some cases, to a retinal "soup" that leaves no viable retina for implantation (the end stage of some varieties of the disease). For an implant to work, some retinal layers must be preserved.

23.6.5 Advancements in Using WBAN in Artificial Retina

In the 1960s, Giles S. Brindley of the University of Cambridge in England attached 80 electrodes to miniature radio receivers and implanted them into the brain of a blind patient. The patient reported seeing phosphenes (flashes of light) [89]. William Dobelle began experiments with cortical stimulation in human volunteers during the 1970s. His early implant system was (understandably) bulky by today's standards: large wires protruded from a hole drilled in the subject's skull and ran to a large computer processing system, and the patient saw light blobs (phosphenes) in black and white. Some of Dr. Dobelle's first implanted patients have used the system (without infection or severe complication) for more than 20 years. In 1992, a blind volunteer at the US National Institute of Neurological Disorders and Stroke learned to recognize phosphene letters (using an implant).


In the early 1990s, scientists led by Mark Humayun (then at Johns Hopkins University, now at Doheny) made an important discovery that created an opportunity for intervention. They discovered that, in certain retinal diseases, RP and AMD, the damaged parts of the eye were the photoreceptor cells of the retina, the rods and cones that normally convert light into electrical impulses. The diseases leave intact the neural paths to the brain, the paths that transport the electrical signals. Thus, while the input from the photoreceptors eventually ceases, up to 70% to 90% of the nerve structures set up to receive those inputs remain intact.

The general thought at the time was that once you lose the photoreceptors, you lose the rest of the circuitry as well. Instead, the nerve cells in the eye are left fairly intact, so the remaining task was how to take images and convert them into tiny electrical pulses. That is exactly what the photoreceptors do: they capture light and convert it into chemical pulses, and this information is sent to the brain, allowing us to see.

Schwiebert et al. [73] developed a microsensor array that can be implanted in the eye as an artificial retina to assist people with visual impairments. In the Smart Sensors and Integrated Microsystems (SSIM) project, a retina prosthesis chip consisting of 100 microsensors is built and implanted within the human eye, as shown in Figure 23.17. This allows patients with no vision or limited vision to see at an acceptable level. Wireless communication is required to support feedback control, image identification, and validation. The communication pattern is deterministic and periodic, so TDMA fits this application best, serving the purpose of energy conservation. Two group communication schemes are investigated: a LEACH-like cluster-head-based approach and a tree-based approach. The system consists of an integrated circuit and an array of sensors. The integrated circuit is a multiplexer with on-chip switches and pads to support a 10 × 10 grid of connections; it operates at 40 kHz. Moreover, it has an embedded transceiver for wired and wireless communications. Each connection in the chip interfaces a sensor through an aluminum probe surface. Before the bonding is done, the entire integrated circuit, except the probe areas, is coated with a biologically inert substance. Each sensor is a microbump; it starts with a rectangular shape but tapers to a point near the end and rests gently on the retina tissue. The sensors are sufficiently small and light to be held in place with relatively little force. The distance between adjacent microbumps is approximately 70 µm. The sensors produce electrical signals proportional to the light reflected from an object being perceived. The ganglia and additional tissues transform the electrical energy into chemical energy, which, in turn, is transformed into optical signals and communicated to the brain through the optic nerves. The magnitude and wave shape of the transformed energy correspond to the response of a normal retina to light stimulation.

Figure 23.17 Illustration of the microsensor array. (Microbump extrusions on a wide-bandgap SiC semiconductor; labeled dimensions range from 1.5 to 50 µm.)
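Because the traffic is deterministic and periodic, a fixed TDMA frame is a natural fit for this kind of implant network; the sketch below assigns each of the 100 sensors one slot per repeating frame. It is a simplified view under an assumed slot length: the 40 kHz figure above is the chip's operating frequency, not a slot-timing specification, so the values here are illustrative only.

```python
def tdma_schedule(n_nodes=100, slot_ms=1.0):
    """Build a fixed TDMA frame for a 10 x 10 microsensor grid.

    Deterministic, periodic traffic lets every node own exactly one slot
    per frame and sleep otherwise, which is what saves energy.
    The slot length is an assumed illustrative value.
    """
    schedule = {node: slot for slot, node in enumerate(range(n_nodes))}
    frame_ms = n_nodes * slot_ms
    return schedule, frame_ms

def slot_owner(t_ms, schedule, slot_ms=1.0):
    """Return the node allowed to transmit at absolute time t_ms."""
    frame_ms = len(schedule) * slot_ms
    return int((t_ms % frame_ms) // slot_ms)  # node id == slot id here

schedule, frame_ms = tdma_schedule()
print(f"frame length: {frame_ms} ms; node 42 transmits in slot {schedule[42]}")
print("owner at t=123.4 ms:", slot_owner(123.4, schedule))
```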

The system is full duplex, allowing communication in the reverse direction. In addition to the transformation of electrical signals into optical signals, neurological signals from the ganglia can be picked up by the microsensors and transmitted out of the sensing system to an external signal processor. In this way, the sensor array is used as a reception and transmission system in a feedback loop. Two types of wireless communication are foreseen. First, the eventual mapping of an input electrical signal to brain patterns cannot be realized internally with the sensing system alone because signal processing is a computationally intensive process. Second, diagnostic and maintenance operations require the extraction of data from the sensing system. For these reasons, interconnecting the sensing system with an external system is essential. In addition, communication is required to transfer data from a charge-coupled device (CCD) camera (embedded in a pair of spectacles) to the sensor array. Figure 23.18 illustrates the signal-processing steps of the artificial retina. A camera embedded in a pair of spectacles directs its output to a real-time digital signal processor (DSP) for data reduction and processing. The camera can be combined with a laser pointer for automatic focusing. The output of the DSP is compressed and transmitted through a wireless link to the implanted sensor array, which decodes the image and produces a corresponding electrical signal.

Figure 23.18 Processing components of the artificial retina: a CCD camera (1000 × 1000 image), image post-processing with data reduction and feature extraction in a DSP, RF power/data transmission over a wireless interface, and an implanted sensing system (in-vivo wireless receiver, retina chip, and neural interface). (From L. Schwiebert et al., Research challenges in wireless networks of biomedical sensors, In The 7th Annual International Conference on Mobile Computing and Networking, Rome, Italy, pp. 151–165, 2001.)

A recent research project at Wayne State University and the Kresge Eye Institute developed an artificial retina using wireless biomedical sensors [89]. The project aimed to build a chronically implanted artificial retina with sufficient visual functionality to allow persons without vision or with limited vision to "see" at an acceptable level. In June 2002, at the 48th meeting of the American Society for Artificial Organs, Dr. William Dobelle announced that his team had created a commercially available artificial eye; the system costs $98,000. The Dobelle artificial eye uses a miniature digital video camera mounted on the lens of sunglasses. Images from the camera are sent to a computer-processing unit on the belt. The cortical chip implant is embedded in the brain through a hole drilled in the skull, and cables run out of the head down to a stimulating control system (also on the belt). Inside the head, the chip contains electrodes that penetrate the brain in the area of the primary visual cortex. Images from the camera go to the microcomputer on the belt, where the processor determines which electrodes are to be stimulated; signals are then sent to the stimulator unit, which activates selected electrodes in the implant. The implants hold 16 electrode arrays (placed on the mesial surface of the occipital lobe). It is possible that, eventually, combinations of cortical and retinal chips networked together might be used to provide even higher degrees of visual processing than can be attained using either approach alone. The system can be disconnected from the camera and plugged directly into a television or a computer, allowing direct access to these technologies. Low resolution does not allow for "normal" viewing of these screens, but it does demonstrate how future chip technologies could be directly linked to output devices. A video monitor can also be patched into the system so that observers can monitor what the patient is viewing.

The DOE Artificial Retina Project [87] is a multi-institutional collaborative effort to develop and implant a device containing an array of microelectrodes into the eyes of people blinded by retinal disease. Ultimately, the goal is to restore useful vision to patients blinded by AMD and RP. This work builds upon a first-generation device containing a 16-electrode array on a miniature disc that can be implanted in the back of the eye to replace a damaged retina. After years of research and laboratory prototypes, artificial vision is being tested in humans for the first time. Artificial vision researchers take inspiration from another device, the cochlear implant, which has successfully restored hearing to thousands of deaf people; however, the human vision system is far more complicated than the hearing system. The eye perceives millions of distinct points of light. Light entering through the pupil is converted to electrical signals by the rods and cones, the light-sensitive cells in the retina. Those electrical pulses are carried through the optic nerve to the brain, where they yield images of the world.

The Model 1 device [developed by Second Sight Medical Products, Inc. (SSMP)] was implanted in six blind patients between 2002 and 2004; their ages ranged from 56 to 77 at the time of implant, and all of them have RP; see Figure 23.19. The device consists of a 16-electrode array in a one-inch package that allows the implanted electronics to communicate wirelessly with a camera mounted on a pair of glasses. It is powered by a battery pack worn on a belt. This implant enables patients to detect when lights are on or off, describe an object's motion, count individual items, and locate objects in their environment. To evaluate the long-term effects of the retinal implant, five devices have been approved for home use. Figure 23.20 shows the use of a small external camera to transmit images to an implanted 4 × 5 mm retina chip with 16 electrodes, which is positioned near the ganglion cell layer of the eye.

Figure 23.19 Argus I is a retinal prosthesis that contains a small implantable chip with electrodes. These electrodes stimulate the retina and help people regain limited vision.

The smaller, more compact Model 2 retinal prosthesis (developed by SSMP with DOE contributions) is currently undergoing clinical trials to evaluate its safety and utility. This model contains 60 electrodes, and its surgical implant time has been reduced from the 6 h required for Model 1 to 2 h.

The Model 3 device, which will have more than 200 electrodes, has undergone extensive design and fabrication studies at the DOE national laboratories and is ready for preclinical testing. The new design uses more advanced materials than the two previous models and has a highly compact array, packed four times more densely with metal contact electrodes and the wiring that connects them to a microelectronic stimulator. Simulations and calculations indicate that the 200+ electrode device should provide improved vision for patients.

Figure 23.21 approximates what patients with retinal devices see. Increasing the number of electrodes will result in more visual perceptions and higher-resolution vision. Scientists worked with patients who received the Model 1 implant (a 4 × 4 array of 16 electrodes) for about one month after surgery to help them interpret what they saw.
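The effect of electrode count on percept resolution can be approximated in a few lines: downsampling an image to an n × n grid of block averages is a rough stand-in for what an n × n electrode array might convey. The sketch below is inspired by, but does not reproduce, the simulator behind Figure 23.21; the threshold and test image are ours.

```python
import numpy as np

def phosphene_view(image, n):
    """Render `image` as an n x n grid of phosphenes (block averages)."""
    image = np.asarray(image, dtype=float)
    h, w = image.shape
    img = image[: h - h % n, : w - w % n]       # crop to a multiple of n
    return img.reshape(n, img.shape[0] // n,
                       n, img.shape[1] // n).mean(axis=(1, 3))

image = np.zeros((128, 128))
image[40:90, 55:75] = 255                       # a simple vertical bar
for n in (4, 16, 32):                           # 16, 256, and 1024 phosphenes
    view = phosphene_view(image, n)
    print(f"{n}x{n}: bar visible in {(view > 64).sum()} of {n*n} phosphenes")
```

At 4 × 4 the bar is barely distinguishable from background, while at 32 × 32 its shape emerges, which is the intuition behind pushing Model 3 past 200 electrodes.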

Figure 23.20 Using a small external camera to transmit images to an implanted 4 mm × 5 mm retina chip with 16 electrodes, which is positioned near the ganglion cell layer of the eye.

In [90], a practical WSN application that will enable blind people to see is proposed. The challenge is that the actual mapping between input images and brain patterns is too complex and not well understood. The connectivity problem in 3D/2D WSNs is studied, and distributed, efficient algorithms to achieve the required connectivity of the system are proposed. A new connectivity algorithm, CDCA, which connects disconnected parts of a network using cooperative diversity, is provided.
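CDCA itself is not reproduced here, but the property it maintains, multihop connectivity of the deployed nodes, is easy to state in code: treat each pair of nodes within radio range as a link and check that one traversal reaches every node. A minimal sketch, assuming a 2D disk-graph model:

```python
from collections import deque

def is_connected(positions, radio_range):
    """True if every node can reach every other node over multihop links.

    `positions` maps node id -> (x, y); two nodes share a link when they
    are within `radio_range`. This checks only the property CDCA targets;
    it is not the CDCA algorithm itself.
    """
    nodes = list(positions)
    if not nodes:
        return True
    r2 = radio_range ** 2

    def neighbors(u):
        ux, uy = positions[u]
        return [v for v in nodes if v != u
                and (positions[v][0] - ux) ** 2
                  + (positions[v][1] - uy) ** 2 <= r2]

    seen, frontier = {nodes[0]}, deque([nodes[0]])   # breadth-first search
    while frontier:
        for v in neighbors(frontier.popleft()):
            if v not in seen:
                seen.add(v)
                frontier.append(v)
    return len(seen) == len(nodes)

nodes = {0: (0, 0), 1: (5, 0), 2: (10, 0), 3: (30, 0)}
print(is_connected(nodes, radio_range=6))    # False: node 3 is isolated
print(is_connected(nodes, radio_range=25))   # True: node 3 reached via node 2
```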

The goal of this proposed approach is to allow people with no vision or with low vision to see what their neighboring partners can see. Unlike some other artificial vision proposals, this approach is applicable to virtually all causes of blindness. The device may also help some legally blind, low-vision patients, because the cortex of sighted people responds to stimulation similarly to the cortex of blind people. Much work has been done to imitate the functionality of the human eye, and artificial vision techniques have been proposed; most involve a digital camera transmitting wireless data to a microchip in the eye [91]. The recurring challenge is that the actual mapping between input images and brain patterns is too complex and not well understood. In this approach, that complex mapping need not be understood, because human brains function in a similar way: the input to one brain (sent from another human with normal vision) can be analyzed in the same way as if it had originated from the blind person's own brain. Because the sensor devices are placed in the human body, the research problems differ from those of traditional WSNs. The sensors are irreplaceable, emergencies must be reported to the central computer, and the network must therefore remain connected at all times, even if that requires deploying some sensor nodes on nearby objects to form a connected network.

The sensor implants are placed inside both the blind man's eye and a normal man's (or dog's) eye (Figure 23.22). Vision is established using the following steps (a minimal sketch follows the list):

A. Light entering through the pupil is converted to electrical signals. Those electrical pulses are converted to data and are stored in the retinal implant of the normal eye.
B. The data is then transmitted using the wireless transceiver to the blind man's eye.
C. The signal travels along a tiny wire to the retinal implant.
D. The back side of the retina is stimulated electrically by the sensors on the chip.
E. These electrical signals are converted to chemical signals by tissue structures, and the response is carried via the optic nerve to the brain.
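A toy end-to-end sketch of steps A through E follows; every function name and parameter is invented for illustration, and real implants would additionally face loss, latency, synchronization, and strict charge-safety limits.

```python
import numpy as np

def capture_normal_eye(light_field):
    """Step A: the donor eye's implant digitizes photoreceptor activity."""
    return np.asarray(light_field, dtype=float)   # stand-in for real sampling

def wireless_transmit(data):
    """Step B: the transceiver sends the frame to the blind eye's implant
    (channel loss and delay are ignored in this sketch)."""
    return data.copy()

def stimulate_retina(frame, max_current_ua=100.0):
    """Steps C-E: the receiving implant scales the frame to safe electrode
    currents; tissue then converts pulses into chemical signals that the
    optic nerve carries to the brain. The current limit is an assumed value.
    """
    scale = frame.max() or 1.0
    return frame / scale * max_current_ua

donor_view = np.random.default_rng(1).random((4, 4))   # toy 4x4 "scene"
currents = stimulate_retina(wireless_transmit(capture_normal_eye(donor_view)))
print(currents.round(1))
```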

Figure 23.21 Images generated by the Artificial Retinal Implant Vision Simulator devised and developed by Wolfgang Fink at the Visual and Autonomous Exploration Systems Research Laboratory, California Institute of Technology. (Panels: 4 × 4, 16 pixels; 16 × 16, 256 pixels; 32 × 32, 1000+ pixels.)

The unique difference between this approach and any other is that it uses the data from one man's eye: there is no need to understand the complex mapping of an image from the retina to the brain via the optic nerve. This application might be very useful in scenarios where the blind person and his partner are focusing their attention on a static phenomenon (e.g., watching a movie or attending a lecture). Cooperative diversity utilizes intermediate transmitters as repeaters to extend the transmission range of a source. The advantage of cooperation is the accumulation of power from all relays, which is a source of diversity as well as of signal-to-noise ratio improvement [92–95].
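A common idealized way to quantify this gain, assuming maximal-ratio combining at the destination (our assumption for illustration, not a claim about CDCA or [92–95]), is that the post-combining signal-to-noise ratio is the sum of the per-branch ratios:

```latex
% Idealized maximal-ratio combining (MRC) at the destination:
% post-combining SNR = direct-link SNR + sum of relayed-branch SNRs.
\gamma_{\mathrm{MRC}} \;=\; \gamma_{0} \,+\, \sum_{i=1}^{R} \gamma_{i}
% Example: a direct link at 3 dB (gamma_0 = 2) plus two relay branches at
% 0 dB each (gamma_i = 1) gives gamma_MRC = 4, i.e., about 6 dB.
```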

Other retinal prosthesis projects are underway in the United States and worldwide, including in Germany, Japan, Ireland, Australia, Korea, China, and Belgium. These programs pursue many different designs and surgical approaches. Some show great promise for the future but have yet to demonstrate practicality in terms of adapting to, and lasting long term in, a human eye. Thus far, the projects that have progressed to clinical (human) trials are the collaborative DOE effort, a project at the now-defunct Optobionics (Chicago), and two efforts in Germany at Intelligent Medical Implants AG and Retinal Implant AG. (For more information on worldwide projects, see Science 312, 1124–1126, 2006.)

23.7 Conclusion

This survey shows that WSN technology offers significant potential in numerous application domains. Given the diverse nature of these domains, it is essential that WSNs perform in a reliable and robust fashion. WSNs have proven their place in the future distributed computing environment; however, significant technical challenges and design issues remain to be addressed.


Figure 23.22 Proposed approach for eye vision: (a) normal man, (b) blind man; labeled components include the eyeball, chip, optic nerve, and wireless communication link. (From M. K. Watfa, Int. J. Inform. Commun. Eng. 4, 5, 2008.)


This survey shows that WBANs can be widely used in medical applications and how these new applications can potentially reduce costs for hospitals, government, and insurance companies. With wireless network–based medical technologies, applications can be designed to be less intrusive in patients' daily lives. The chapter has surveyed a number of systems and applications for health care, mainly wearable and implantable sensors. As nanotechnology provides smaller, multifunctional sensor nodes, these new small body networks will evolve further, making their use as natural as wearing clothes in the near future. Given the importance of providing smart health care for the elderly, the chronically ill, and children, researchers have started to explore technological solutions that enhance the provision of health and social care in a way that complements existing services. The common types of wireless sensors are described, the benefits of using wireless networks for medical applications are discussed, and the major challenges and open research problems of WBANs are outlined briefly. The chapter concentrates on applying WBAN to vision rehabilitation and on how to build a chronically implanted artificial retina for visually impaired people, addressing two retinal diseases: AMD and RP. The architecture of the human retina, the operation of artificial retinas, and the progression of disease states are described in detail. Advancements in using WBAN in artificial retinas are discussed to illustrate the enormous development of this approach and its degree of technical efficiency.

23.8 Future Directions

With their potential uses in the medical and home health care fields, wireless networks can contribute substantially to improving the lives of patients. Besides bringing comfort to patients, there are large commercial benefits in reducing costs and improving equipment and patient management. In the near future, the use of WBANs in vision repair will increase because it allows patients to be more self-sufficient, and the implant could turn out to be cheaper than governments paying for higher levels of health care or in-home care. The system can therefore reach ubiquity, where each individual has a computational module able to interact seamlessly with the smart space's system to overcome blindness.

As stated, using WBAN in artificial retinas is certainly a step in the right direction but, unfortunately, does not help those with traumatic retinal detachments or with tears that may lead to retinal detachments. Nonetheless, it is progress, and hopefully that progress will continue so that one day a retinal detachment does not result in a complete loss of vision. Operations can also be improved by providing early warning of system failure and a predictive maintenance tool, improving energy management, providing automatic record keeping for regulatory compliance, eliminating personnel training costs, and reducing insurance costs. The collaboration and synergy of sensing, processing, communication, and actuation is the next step in exploiting the potential of these technologies.

The following are general considerations and challenges facing the future of retinal prostheses. All of them are important and should be taken into account in future work:

1. The artificial retina must deliver visual information in real time, just like the human retina. Extra delay would mean that the patient sees the scene later than a person with normal vision, which could lead the patient to make wrong and dangerous decisions, such as crossing a street at the wrong time.

2. The artificial retina's power must not burn or injure the remaining part of the retina or the eye. The human retina consumes about a 10th of a watt [96]; an artificial retina should consume no more than that (a rough budget check is sketched after this list).


3. The artificial retina must be small, just like the human retina: about half a millimeter thick and less than a cm² in area.
4. The artificial retina must be biocompatible.
5. The prosthetic device interface that stimulates the remaining functional part of the human retina must be convex, to conform to the convex shape of the retina.
6. The signal amplitude should be adjustable from patient to patient. Retinal tissue resistance varies from person to person, so the device should accommodate increasing or decreasing the electrical signal amplitude.
7. The human retina processes visual scene information to enhance the image and possibly to extract useful and varying information from the scene; in its various layers it performs spatial averaging, contrast detection, adaptation, and more. A retinal prosthetic device should process the visual scene in a similar way.
8. A chip that can be implanted through a smaller surgical cut is highly preferred. A chip that requires surgeons to make a large surgical cut might be considered risky and may not be implantable.
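As a rough illustration of consideration 2, the sketch below checks several electrode counts against the 0.1 W retinal budget. The pulse rate, per-pulse charge, and voltage are invented illustrative values, not device specifications; only the electrode counts echo the Model 1/2/3 figures above.

```python
def stimulation_power_w(n_electrodes, pulse_rate_hz, charge_uc, voltage_v):
    """Average electrical power delivered by the electrode array.

    Energy per pulse is approximated as charge x voltage; all parameter
    values used below are assumptions for illustration, not device data.
    """
    energy_j = charge_uc * 1e-6 * voltage_v      # joules per pulse
    return n_electrodes * pulse_rate_hz * energy_j

RETINA_BUDGET_W = 0.1   # roughly what the human retina consumes [96]

for n in (16, 60, 200):  # Model 1/2/3 electrode counts
    p = stimulation_power_w(n, pulse_rate_hz=60, charge_uc=0.5, voltage_v=5.0)
    status = "within" if p <= RETINA_BUDGET_W else "OVER"
    print(f"{n:3d} electrodes: {p*1e3:6.2f} mW ({status} the 0.1 W budget)")
```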

References

1. N. Bulusu, D. Estrin, L. Girod, and J. Heidemann, Scalable coordination for wireless sensor networks: self-configuring localization systems, In Proceedings of the International Symposium on Communication Theory and Applications (ISCTA 2001), Ambleside, UK, pp. 1797–1800, July 2001.
2. A. Cerpa and D. Estrin, ASCENT: adaptive self-configuring sensor networks topologies, In Proceedings of IEEE INFOCOM 2002, pp. 1278–1287, 2002.
3. Y. H. Nam et al., Development of remote diagnosis system integrating digital telemetry for medicine, In Proceedings of the International Conference IEEE-EMBS, Hong Kong, pp. 1170–1173, 1998.
4. U. Varshney, Pervasive healthcare and wireless health monitoring, Mobile Netw. Appl., 12, 113–127, March 2007.
5. K. W. Goh, J. Lavanya, Y. Kim, E. K. Tan, and C. B. Soh, A PDA-based ECG beat detector for home cardiac care, In IEEE Engineering in Medicine and Biology Society, Shanghai, China, pp. 375–378, 2005.
6. H. Zhou, K. M. Hou, J. Ponsonnaille, L. Gineste, and C. D. Vaulx, A real-time continuous cardiac arrhythmias detection system: RECAD, In IEEE Engineering in Medicine and Biology Society, Shanghai, China, pp. 875–881, 2005.
7. H. Chu, C. Huang, Z. Lian, and T. Tsai, A ubiquitous warning system for asthma inducement, In IEEE International Conference on Sensor Networks, Ubiquitous and Trustworthy Computing, Taichung, Taiwan, pp. 186–191, 2006.
8. L. Huaming and T. Jindong, Body sensor network based context aware QRS detection, In Pervasive Health Conference and Workshops, Innsbruck, Austria, pp. 1–8, 2006.
9. J. Luprano, J. Sola, S. Dasen, J. M. Koller, and O. Chelelat, Combination of body sensor networks and on-body signal processing algorithms: the practical case of MyHeart project, In International Workshop on Wearable and Implantable Body Sensor Networks (BSN 2006), Cambridge, MA, USA, 2006.
10. R. Jafari, A. Encarnação, A. Zahoory, F. Dabiri, H. Noshadi, and M. Sarrafzadeh, Wireless sensor networks for health monitoring, In Second Annual International Conference on Mobile and Ubiquitous Systems: Networking and Services (MobiQuitous 2005), pp. 479–481, 2005.
11. O. Chipara, C. Lu, T. C. Bailey, and G. C. Roman, Reliable patient monitoring: a clinical study in a step-down hospital unit, Department of Computer Science and Engineering, Washington University in St. Louis, St. Louis, MO, Technical Report WUCSE-2009-82, December 2009.
12. J. Ko, J. Lim, Y. Chen, R. Musaloiu-E., A. Terzis, G. Masson, T. Gao, W. Destler, L. Selavo, and R. Dutton, MEDiSN: medical emergency detection in sensor networks, ACM Trans. Embedded Comput. Syst., 10, 1, article 11, 2010.


K15146_C023.indd 707 4/19/2013 6:16:05 AM

Page 32: Artificial Eye Vision Using Wireless Sensor Networks...Artificial Eye Vision Using Wireless Sensor Networks 679 application. This vision can only be applied through the use of common

708 ◾ Wireless Sensor Networks

13. T. Gao, C. Pesto, L. Selavo, Y. Chen, J. Ko, J. Lim, A. Terzis, A. Watt, J. Jeng, B. Chen, K. Lorincz, and M. Welsh, Wireless medical sensor networks in emergency response: implementation and pilot results, In IEEE International Conference on Technologies for Homeland Security, pp. 187–192, 2008.
14. D. Malan, T. Fulford-Jones, M. Welsh, and S. Moulton, CodeBlue: an ad hoc sensor network infrastructure for emergency medical care, In Proceedings of the MobiSys Workshop on Applications for Mobile Embedded Systems, pp. 12–14, June 2004.
15. G. Virone, A. Wood, L. Selavo, Q. Cao, L. Fang, T. Doan, Z. He, and J. A. Stankovic, An advanced wireless sensor network for health monitoring, In Proceedings of the Transdisciplinary Conference on Distributed Diagnosis and Home Healthcare, pp. 95–100, April 2006.
16. S. Kumar, AutoSense, NIH GEI project at The University of Memphis, 2007. [Online]. Available: http://www.cs.memphis.edu/~santosh/AutoSense.html.
17. K. Patrick, A tool for geospatial analysis of physical activity: physical activity location measurement system (PALMS), NIH GEI project at the University of California at San Diego, 2007. [Online]. Available: http://www.gei.nih.gov/exposurebiology/program/docs/KevinPatrick.pdf.
18. Q. Tang, N. Tummala, S. Gupta, and L. Schwiebert, Communication scheduling to minimize thermal effects of implanted biosensor networks in homogeneous tissue, IEEE Trans. Biomed. Eng., 52, 1285–1294, 2005.
19. E. Jovanov, A. Milenkovic, C. Otto, and P. de Groen, A wireless body area network of intelligent motion sensors for computer assisted physical rehabilitation, J. Neuroeng. Rehabil., 1, 2–6, 2005.
20. J. Coosemans and R. Puers, An autonomous bladder pressure monitoring system, Sens. Actuat. A, 123–124, 155–161, 2005.
21. P. Troyk and M. Schwan, Closed-loop class E transcutaneous power and data link for micro-implants, IEEE Trans. Biomed. Eng., 39, 589–599, 1992.
22. M. Quwaider and S. Biswas, Body posture identification using hidden Markov model with a wearable sensor network, In Proceedings of the ICST 3rd International Conference on Body Area Networks, Tempe, AZ, USA, pp. 1–8, March 2008.
23. E. Krames, Implantable devices for pain control: spinal cord stimulation and intrathecal therapies, Best Pract. Res. Clin. Anaesthesiol., 16, 4, 619–649, December 2002.
24. K. Chen and D. J. Bassett, The technology of accelerometry-based activity monitors: current and future, Med. Sci. Sports Exerc., 37, 490–500, 2005.
25. M. Quwaider and S. Biswas, Physical context detection using multimodal sensing with wearable wireless networks, J. Commun. Softw. Syst., 4, 191–202, 2008.
26. K. W. Goh, J. Lavanya, Y. Kim, E. K. Tan, and C. B. Soh, A PDA-based ECG beat detector for home cardiac care, In IEEE Engineering in Medicine and Biology Society, Shanghai, China, pp. 375–378, 2005.
27. J.-L. Lin, H. C. Liu, Y.-T. Tai, H.-S. Wu, S.-J. Hsu, F.-S. Jaw, and Y.-Y. Chen, The development of wireless sensor network for ECG monitoring, In The 28th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, New York, USA, pp. 3513–3516, 2006.
28. S. A. Taylor and H. Sharif, Wearable patient monitoring application (ECG) using wireless sensor networks, In 28th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, New York, USA, pp. 5977–5980, 2006.
29. J. Kolbe, W. Fergursson, and J. Garret, Rapid onset asthma: a severe but uncommon manifestation, Thorax, 53, 241–247, April 1998.
30. S. Sur, T. Crotty, G. Kephart, B. Hyma, T. Colby, C. Reed, L. Hunt, and G. Gleich, Sudden-onset fatal asthma: a distinct entity with few eosinophils and relatively more neutrophils in the airway submucosa?, Am. Rev. Respir. Dis., 148, 3, 1993.
31. K. Lorincz, D. J. Malan, T. R. F. Fulford-Jones, A. Nawoj, A. Clavel, V. Shnayder, G. Mainland, M. Welsh, and S. Moulton, Sensor networks for emergency response, IEEE Pervasive Comput., 3, 16–23, October–December 2004.
32. E. Jovanov, A. O. D. Lords, D. Raskovic, P. G. Cox, R. Adhami, and F. Andrasik, Stress monitoring using a distributed wireless intelligent sensor system, IEEE Eng. Med. Biol., 49–55, 2003.
33. Y. J. Zhao, A. Davidson, J. Bain, S. Q. Li, Q. Wang, and Q. Lin, A MEMS viscometric glucose monitoring device, In The 13th International Conference on Solid-State Sensors, Actuators and Microsystems (TRANSDUCERS '05), pp. 1816–1819, 2005.


K15146_C023.indd 708 4/19/2013 6:16:05 AM

Page 33: Artificial Eye Vision Using Wireless Sensor Networks...Artificial Eye Vision Using Wireless Sensor Networks 679 application. This vision can only be applied through the use of common

Artificial Eye Vision Using Wireless Sensor Networks ◾ 709

34. I. F. Akyildiz, W. Su, Y. Sankarasubramaniam, and E. Cayirci, A survey on sensor networks, IEEE Commun. Mag., 40, 8, 102–114, August 2002.
35. National Center for Health Statistics, http://www.cdc.gov/nchs/Default.htm, Accessed December 2011.
36. A. Darwish and A. E. Hassanien, Wearable and implantable wireless sensor network solutions for healthcare monitoring, Sensors, 11, 5561–5595, 2011, doi:10.3390/s110605561.
37. L. Theogarajan, J. Wyatt, J. Rizzo, B. Drohan, M. Markova, S. Kelly, G. Swider, M. Raj, D. Shire, M. Gingerich, J. Lowenstein, and B. Yomtov, Minimally invasive retinal prosthesis, In 2006 IEEE International Solid-State Circuits Conference Digest of Technical Papers, pp. 99–108, February 2006.
38. T. Penzel, B. Kemp, G. Klosch, A. Schlogl, J. Hasan, A. Varri, and I. Korhonen, Acquisition of biomedical signals databases, IEEE Engineering in Medicine and Biology Magazine, 20, 3, 25–32, May/June 2001.
39. S. Arnon, D. Bhastekar, D. Kedar, and A. Tauber, A comparative study of wireless communication network configurations for medical applications, IEEE Wirel. Commun., 10, 1, 56–61, February 2003.
40. B. Gyselinckx, J. Penders, and R. Vullers, Potential and challenges of body area networks for cardiac monitoring, J. Electrocardiol., 40, 6, S165–S168, November–December 2006.
41. A. H. Mark, C. P. Harry, T. B. Adam, R. Kyle, H. C. Benton, H. A. James, and L. John, Body area sensor networks: challenges and opportunities, In IEEE Computer Society, Atlantic City, NJ, USA, pp. 58–65, 2009.
42. IEEE standard for safety levels with respect to human exposure to radio frequency electromagnetic fields, 3 kHz to 300 GHz, 1999.
43. G.-Z. Yang, Ed., Body Sensor Networks, Springer-Verlag London Limited, 2006.
44. D. Malan, T. Fulford-Jones, M. Welsh, and S. Moulton, CodeBlue: an ad hoc sensor network infrastructure for emergency medical care, In Proceedings of the MobiSys Workshop on Applications for Mobile Embedded Systems, pp. 12–14, June 2004.
45. J. Ko, J. Lim, Y. Chen, R. Musaloiu-E., A. Terzis, G. Masson, T. Gao, W. Destler, L. Selavo, and R. Dutton, MEDiSN: medical emergency detection in sensor networks, ACM Trans. Embedded Comput. Syst., 10, 1, article 11, 2010.
46. O. Chipara, C. Lu, T. C. Bailey, and G.-C. Roman, Reliable patient monitoring: a clinical study in a step-down hospital unit, Department of Computer Science and Engineering, Washington University in St. Louis, St. Louis, MO, Technical Report WUCSE-2009-82, December 2009.
47. B. Chen, K. K. Muniswamy-Reddy, and M. Welsh, Ad-hoc multicast routing on resource-limited sensor nodes, In Proceedings of the 2nd International Workshop on Multi-Hop Ad Hoc Networks: From Theory to Reality, Florence, Italy, pp. 87–94, 2006.
48. K. Lorincz and M. Welsh, MoteTrack: a robust, decentralized approach to RF-based location tracking, Pers. Ubiquit. Comput., 11, 6, 489–503, 2007.
49. R. Jain, Survey paper: medical applications of wireless networks, available online at http://www.cse.wustl.edu/~jain/, Accessed December 2011.
50. O. Gnawali, R. Fonseca, K. Jamieson, D. Moss, and P. Levis, Collection tree protocol, In Proceedings of the 7th ACM Conference on Embedded Networked Sensor Systems, Berkeley, CA, pp. 1–14, 2009.
51. K. Laerhoven, B. Lo, J. Ng, S. Thiemjarus, R. King, S. Kwan, H. Gellersen, M. Sloman, O. Wells, P. Needham, N. Peters, A. Darzi, C. Toumazou, and G. Yang, Medical healthcare monitoring with wearable and implantable sensors, In UbiHealth 2004: 3rd International Workshop on Ubiquitous Computing for Pervasive Healthcare Applications, Nottingham, England, 2004.
52. U. Maurer, A. Rowe, A. Smailagic, and D. Siewiorek, eWatch: a wearable sensor and notification platform, In The International Workshop on Wearable and Implantable Body Sensor Networks (BSN 2006), Cambridge, MA, USA, 2006.
53. A. Pentland, Healthwear: medical technology becomes wearable, Computer, 37, 5, 42–49, May 2004.
54. A. Salarian, H. Russmann, F. J. G. Vingerhoets, C. Dehollain, Y. Blanc, P. R. Burkhard, and K. Aminian, Gait assessment in Parkinson's disease: toward an ambulatory system for long-term monitoring, IEEE Trans. Biomed. Eng., 51, 8, 1434–1443, August 2004.


K15146_C023.indd 709 4/19/2013 6:16:05 AM

Page 34: Artificial Eye Vision Using Wireless Sensor Networks...Artificial Eye Vision Using Wireless Sensor Networks 679 application. This vision can only be applied through the use of common

710 ◾ Wireless Sensor Networks

55. M. Visintin, H. Barbeau, N. Korner-Bitensky, and N. E. Mayo, A new approach to retrain gait in stroke patients through body weight support and treadmill stimulation, Stroke, 29, 6, 1122–1128, June 1998.
56. J. He, H. Li, and J. Tan, Real-time daily activity classification with wireless sensor networks using hidden Markov model, In Proceedings of the 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 3192–3195, August 2007.
57. E. Miluzzo, N. D. Lane, K. Fodor, R. Peterson, H. Lu, M. Musolesi, S. B. Eisenman, X. Zheng, and A. T. Campbell, Sensing meets mobile social networks: the design, implementation and evaluation of the CenceMe application, In Proceedings of the 6th ACM Conference on Embedded Network Sensor Systems, Raleigh, NC, pp. 337–350, 2008.
58. A. Ahmadi, D. D. Rowlands, and D. A. James, Investigating the translational and rotational motion of the swing using accelerometers for athlete skill assessment, In Proceedings of the 5th IEEE Conference on Sensors, pp. 980–983, October 2006.
59. F. Michahelles and B. Schiele, Sensing and monitoring professional skiers, IEEE Pervasive Comput., 4, 3, 40–46, July–September 2005.
60. K. Lorincz, B. Chen, G. W. Challen, A. R. Chowdhury, S. Patel, P. Bonato, and M. Welsh, Mercury: a wearable sensor network platform for high-fidelity motion analysis, In Proceedings of the 7th ACM Conference on Embedded Network Sensor Systems, pp. 353–366, 2009.
61. S. Patel, K. Lorincz, R. Hughes, N. Huggins, J. H. Growdon, M. Welsh, and P. Bonato, Analysis of feature space for monitoring persons with Parkinson's disease with application to a wireless wearable sensor system, In Proceedings of the 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 6290–6293, August 2007.
62. R. Ganti, P. Jayachandran, T. Abdelzaher, and J. Stankovic, SATIRE: a software architecture for smart AtTIRE, In Proceedings of the 4th International Conference on Mobile Systems, Applications, and Services, Uppsala, Sweden, pp. 110–123, June 2006.
63. Crossbow Corporation, MICAz specifications. [Online]. Available: http://www.xbow.com/Support/Support_pdf_files/MPR-MIB_Series_Users_Manual.pdf.
64. M. Soini, J. Nummela, P. Oksa, L. Ukkonen, and L. Sydänheimo, Wireless body area network for hip rehabilitation system, UbiCC J.
65. MobiHealth project, http://www.mobihealth.org, Accessed December 2011.
66. P. Neves, M. Stachyra, and J. Rodrigues, Application of wireless sensor networks to healthcare promotion, J. Commun. Softw. Syst., 4, 3, September 2008.
67. G. Mulligan and the LoWPAN Working Group, The LoWPAN architecture, In The 4th Workshop on Embedded Networked Sensors, Cork, Ireland, 2007.
68. J. G. Ko, C. Lu, M. B. Srivastava, J. A. Stankovic, A. Terzis, and M. Welsh, Wireless sensor networks for healthcare, Proc. IEEE, 98, 11, November 2010.
69. S. Kumar, AutoSense, NIH GEI project at The University of Memphis, 2007. [Online]. Available: http://www.cs.memphis.edu/~santosh/AutoSense.html.
70. T. Das, P. Mohan, V. N. Padmanabhan, R. Ramjee, and A. Sharma, PRISM: platform for remote sensing using mobile smartphones, In Proceedings of the 8th International Conference on Mobile Systems, Applications, and Services, San Francisco, pp. 63–76, June 2010.
71. P. Khan, M. A. Hussain, and K. S. Kwak, Medical applications of wireless body area networks, JDCTA, 3, 185–193, 2009.
72. J. A. Stankovic, Q. Cao, T. Doan, L. Fang, Z. He, R. Kiran, S. Lin, S. Son, R. Stoleru, and A. Wood, Wireless sensor networks for in-home healthcare: potential and challenges, In High Confidence Medical Device Software and Systems Workshop, Pennsylvania, USA, 2005.
73. L. Schwiebert, S. K. S. Gupta, and J. Weinmann, Research challenges in wireless networks of biomedical sensors, In The 7th Annual International Conference on Mobile Computing and Networking, Rome, Italy, pp. 151–165, 2001.
74. B. Krishnamachari, Networking Wireless Sensors, Cambridge University Press, 2006.
75. J. Yoo, L. Yan, S. Lee, Y. Kim, and H. Yoo, A 5.2 mW self-configured wearable body sensor network controller and a 12 µW wirelessly powered sensor for a continuous health monitoring system, IEEE J. Solid-State Circ., 45, 178–188, 2010.


K15146_C023.indd 710 4/19/2013 6:16:05 AM

Page 35: Artificial Eye Vision Using Wireless Sensor Networks...Artificial Eye Vision Using Wireless Sensor Networks 679 application. This vision can only be applied through the use of common

Artificial Eye Vision Using Wireless Sensor Networks ◾ 711

76. T. Gao, D. Greenspan, M. Welsh, R. Juang, and A. Alm, Vital signs monitoring and patient tracking over a wireless network, In Proceedings of the 27th Annual International Conference of the IEEE EMBS, Shanghai, China, pp. 102–105, September 2005.
77. M. Renaud, K. Karakaya, T. Sterken, P. Fiorini, C. Hoof, and R. Puers, Fabrication, modelling and characterization of MEMS piezoelectric vibration harvesters, Sens. Actuat. A, 145–146, 380–386, 2008.
78. C. Lauterbach, M. Strasser, S. Jung, and W. Weber, Smart clothes self-powered by body heat, In Proceedings of the Avantex Symposium, Frankfurt, Germany, pp. 5259–5263, May 2002.
79. S. Cherukuri, K. Venkatasubramanian, and S. Gupta, BioSec: a biometric based approach for securing communication in wireless networks of biosensors implanted in the human body, In Proceedings of the 2003 International Conference on Parallel Processing Workshops (ICPPW'03), 2003.
80. J. Misic, Enforcing patient privacy in healthcare WSNs using ECC implemented on 802.15.4 beacon enabled clusters, In Proceedings of the Sixth Annual IEEE International Conference on Pervasive Computing and Communications, Hong Kong, pp. 686–691, March 2008.
81. G. Zhang, C. Poon, Y. Li, and Y. Zhang, A biometric method to secure telemedicine systems, In Proceedings of the 31st Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN, USA, pp. 701–704, September 2009.
82. S. Bao, Y. Zhang, and L. Shen, Physiological signal based entity authentication for body area sensor networks and mobile healthcare systems, In Proceedings of the 27th Annual International Conference of the IEEE EMBS, Shanghai, China, pp. 2455–2458, September 2005.
83. T. Hori, Y. Nishida, T. Suehiro, and S. Hirai, SELF-Network: design and implementation of network for distributed embedded sensors, In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Takamatsu, Japan, pp. 1373–1378, October–November 2000.
84. Y. Mori, M. Yamauchi, and K. Kaneko, Design and implementation of the vital sign box for home healthcare, In Proceedings of the IEEE EMBS International Conference on Information Technology Applications in Biomedicine, Arlington, VA, USA, pp. 104–109, November 2000.
85. M. Gietzelt, K. Wolf, M. Marschollek, and R. Haux, Automatic self-calibration of body worn triaxial accelerometers for application in healthcare, In Proceedings of the Second International Conference on Pervasive Computing Technologies for Healthcare, Tampere, Finland, pp. 177–180, January 2008.
86. E. Biagioni and G. Sasaki, Wireless sensor placement for reliable and efficient data collection, In Proceedings of the Hawaii International Conference on System Sciences, January 2003.
87. DOE Artificial Retina Project, http://artificialretina.energy.gov, Accessed January 10, 2012.
88. National Alliance for Eye and Vision Research, www.eyeresearch.org.
89. W. Dobelle, Retinal and cortical chip implants: artificial vision, In Eye and Chip World Conference on Artificial Vision, Lawrence Tech University, Southfield, MI, June 2002. http://www.wayfinding.net/iibnNECtextchip.htm.
90. M. K. Watfa, Practical applications and connectivity algorithms in future wireless sensor networks, Int. J. Inform. Commun. Eng., 4, 5, 2008.
91. C. Veraart, F. Duret, M. Brelen, and J. Delbeke, Vision rehabilitation with the optic nerve visual prosthesis, In Proceedings of the 26th Annual International Conference of the IEEE EMBS, San Francisco, USA, pp. 4163–4165, September 2004.
92. I. Maric and R. Yates, Efficient multihop broadcast for wideband systems, In Proceedings of the DIMACS Workshop on Signal Processing for Wireless Transmission, Piscataway, NJ, pp. 285–299, October 2002.
93. A. Sendonaris, E. Erkip, and B. Aazhang, Increasing uplink capacity via user cooperation diversity, In Proceedings of the IEEE International Symposium on Information Theory, p. 156, 1998.
94. A. Sendonaris, E. Erkip, and B. Aazhang, User cooperation – Part I: system description, and Part II: implementation aspects and performance analysis, IEEE Trans. Commun., 51, 11, 1927–1938, November 2003.
95. J. Laneman, D. Tse, and G. Wornell, Cooperative diversity in wireless networks: efficient protocols and outage behavior, IEEE Trans. Inform. Theory, 50, 12, 3062–3080, December 2004.
96. https://www.stanford.edu/group/brainsinsilicon/vision.html, Accessed December 17, 2011.



Related Web Sites for More Information

AMD Alliance: www.amdalliance.org
American Council of the Blind: www.acb.org
Biomimetic MicroElectronic Systems: bmes-erc.usc.edu
DOE Artificial Retina Project: ArtifcialRetina.energy.gov
Doheny Eye Institute: www.usc.edu/hsc/doheny/
Foundation Fighting Blindness: www.blindness.org
National Alliance for Eye and Vision Research: www.eyeresearch.org
Prevent Blindness America: www.preventblindness.org
Second Sight Medical Products: www.2-sight.com
