Research Article

An Implementation Approach and Performance Analysis of Image Sensor Based Multilateral Indoor Localization and Navigation System

Md. Shahjalal, Md. Tanvir Hossan, Moh. Khalid Hasan, Mostafa Zaman Chowdhury, Nam Tuan Le, and Yeong Min Jang

Department of Electronics Engineering, Kookmin University, Seoul, Republic of Korea

Correspondence should be addressed to Yeong Min Jang; [email protected]

Received 30 April 2018; Revised 10 September 2018; Accepted 4 October 2018; Published 22 October 2018

Academic Editor: Tony T. Luo

Copyright © 2018 Md. Shahjalal et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Optical camera communication (OCC) is of considerable importance nowadays in various indoor camera-based services such as smart homes and robot-based automation. An Android smartphone camera mounted on a mobile robot (MR) offers a uniform communication distance when the camera remains at the same level, which can reduce the communication error rate. Indoor mobile robot navigation (MRN) is considered a promising OCC application in which white light-emitting diodes (LEDs) and an MR camera are used as the transmitters and receiver, respectively. Positioning is a key issue in MRN systems in terms of accuracy, data rate, and distance. We propose a combined indoor navigation and positioning algorithm and evaluate its performance. An Android application was developed to support data acquisition from multiple simultaneous transmitter links. Experimentally, we received data from four links, which are required to ensure higher positioning accuracy.

1. Introduction
Currently, optical wireless communication (OWC) is extensively used to mitigate the data traffic of mobile communications, using the ultraviolet (UV), infrared (IR), or visible light (VL) spectrum as the propagation medium. The VL spectrum is extensively used by OCC, light fidelity (Li-Fi), and visible light communication (VLC) [1, 2]. OCC is becoming a promising technology for localization and navigation because the global positioning system (GPS) and radio frequency (RF) based wireless networks, such as wireless fidelity (Wi-Fi), ultra-wideband (UWB), Bluetooth, and wireless local area networks (WLANs), do not provide highly accurate location-based services. Moreover, GPS works well only for outdoor applications, such as tracking and robot navigation, and it is considerably difficult to estimate accurate locations using RF technology in an indoor environment. WLAN-based localization schemes, such as AOA, TDOA, and WLAN fingerprinting, face several challenges because WLAN was developed initially only for wireless networking. In the AOA and TDOA approaches, the users have to know the locations of the access points (APs), which cannot provide maximum network coverage, and the localization errors are high in NLOS cases [3]. Although the fingerprinting method, which uses the power of the received signal, has recently attracted considerable attention, it also faces several challenges. Specific RSS measurement is not possible for individual APs due to dense deployment. Moreover, the surveying stage is time consuming and records the RSS over a time period, which increases the complexity of the system. The RSS also weakens under NLOS propagation due to the presence of obstacles such as walls, furniture, and doors [4].
In addition, if there are many WLANs in a typical region, they cause extra interference with each other. In contrast, OCC is a new technology that offers low-cost, simple, low-latency communication and higher positioning accuracy. We used a smartphone-based localization technique using OCC, where the image is processed by controlling the camera properties so that other lights cannot cause interference.

Hindawi Wireless Communications and Mobile Computing, Volume 2018, Article ID 7680780, 13 pages. https://doi.org/10.1155/2018/7680780




As each room contains an LED light, there is no possibility that objects will interrupt the line-of-sight (LOS) communication.

OCC uses an unlicensed spectrum in which LED infrastructures are used as the transmitters. An LED is an electronic light source whose illumination intensity can be controlled at a high frequency using an externally driven circuit. The LED is a useful transmitter because it is energy efficient, common in indoor lighting infrastructure, and cheap; additionally, highly accurate data transmission is possible due to the variable luminary properties of the LED. Digital data is transmitted through the LED light by varying the properties of the LED according to different modulation schemes. Typically, for low-frame-rate (25-30 fps) commercial cameras, undersampling modulation techniques are used. These techniques include phase-shift on-off keying (PSOOK) [5], frequency-shift on-off keying (FSOOK) [6], and m-ary pulse amplitude modulation (PAM) [7].

A color camera, which is used as the receiver for OCC applications, typically comprises an image sensor (IS), a lens, and a Bayer filter. A camera exhibits several benefits over a photodiode (PD) in terms of an extended field of view (FOV), and because the pixels of the IS can receive light from various directions, it provides a high signal-to-noise ratio (SNR). Localization is possible at cm-level accuracy in an indoor environment under stable illumination from an external light source. Several positioning schemes using a VLC system have already been proposed. In one report, a VLC-based positioning system in which the LEDs wirelessly transmit the location information has been proposed [8]. The location information signals were received by a PD, and the spatial distribution was measured at each reference point. Further, the information from people or objects can be obtained using map positioning. Some localization or navigation services using OCC employ an external cloud server to store the mapping data and to compute the position, where the camera is connected to the server via Wi-Fi or Bluetooth during location estimation [9]. This approach exhibits a few drawbacks because it requires an external server and an additional RF network, which is expensive and time consuming.

For localizing a rolling shutter (RS) camera mounted on an MR, the location information should be received by the camera from multiple LEDs. We propose an Android-phone-mounted mobile robot positioning (MRP) approach using multiple frequency-shift keying (MFSK), in which we are able to obtain four location IDs from four different LEDs simultaneously. We enhance the distance measurement technique, which allows a maximum error of 2 cm at a 100 cm horizontal distance in the case of a single LED inside the FOV, even when the camera and LED are not in vertical line-of-sight (LOS). An MRN algorithm is also proposed to navigate the MR to different locations. Relevant work on indoor localization is described in Section 2. Section 3 describes the overall scenario of the indoor MRP and MRN systems as well as the transmitter characteristics. A brief description of the Android application and the openCV libraries, rolling shutter effects, and MR specifications is provided in Section 4. The proposed algorithm and the remaining operational characteristics, such as the camera FOV and the exploration

of ID distribution systems, are described in Section 5. In Section 6, the details of the accuracy enhancement technique are provided. A demonstration scenario and results are presented in Section 7. Finally, this research work concludes in Section 8.

2. Related Works

Currently, camera-bearing mobile phones are commonly used and are important commodities in e-commerce. Therefore, location-based services (LBS) have become an important issue. To ensure better performance in LBS, it is mandatory to measure the location of these mobile devices accurately, especially indoors.

GPS faces challenges for indoor localization. It is a LOS-based localization solution that accumulates sensor information from satellites located at a considerable distance (20,000 km) from the ground. The satellite signals are interrupted by obstacles on the ground, such as trees and buildings, due to the LOS channel characteristic and the considerable distance between satellites and sensors. Therefore, considerably large modifications are required to make the GPS system suitable for indoor localization. For example, the integration of GPS signals with indoor transmitting antennas has been reported to localize sensor nodes [10]; however, it is not cost effective, and a localization error of 1 m is observed with a delay of 10 s.

The distinct and novel characteristics of OWC-based techniques make them important supporting candidates over existing solutions for indoor localization and navigation scenarios. Several VL-based localization and navigation schemes for mobiles, PDs, or wearable computers have been reported [11-17]. There is still considerable debate about the achievable localization resolution; a lower localization resolution was observed in the simulation results of various conditional approaches. An important subpart of OWC is OCC, in which the camera receives a signal from the modulated light beam of an LED. An IS detects the intensity of the metameric LEDs within a fixed indoor environment [18]. The localization performance exhibits a 1-degree orientation measurement, and the calculated resolution was observed to be 1.5 and 3.58 cm for 2D and 3D space, respectively. Without measuring the angular displacement, a 0.001-m localization resolution has been observed [19]. In a similar approach, localization using an LED transmitter and a camera as the receiver has been discussed by other groups [20, 21]. Additionally, the information from an accelerometer sensor has been included; further, the demodulated data from the IS could be used to improve the overall performance in a 3D indoor environment [22].

Popular methods for localization measure the signal strength at the receiver, i.e., a photodiode or camera. This signal can be either visible light or RF. An RSS-based localization scheme for an aperture-based receiver, which exhibits a wide FOV and better angular diversity, has been proposed [23]. The authors derived the Cramer-Rao lower bound of position estimation for improving the overall performance.


Figure 1: A scenario of the proposed model. (a) MRP and MRN system using OCC; (b) transmitter set.

Meanwhile, a generic framework of RSS has been introduced to enrich the positioning performance [24]. Alternatively, the combination of spatiotemporal constraints in RSS fingerprint-based localization has achieved the same purpose [25]. Compressive sensing has been applied to improve the localization performance in noisy scenarios [26]. The RSS approach has also been applied in VL-based positioning schemes [27-29]. Here, the localization resolution was observed to be 4 m [30]. Importantly, an artificial intelligence algorithm applied to collect the location-labeled fingerprints was reported for optimizing the overall performance [31, 32]. Although the positioning accuracy obtained for the RSS fingerprints was observed to be 80% [33], a 3D space fingerprint requires considerable overhead for positioning a high-speed object as compared with a simpler 2D space fingerprint.

Time-of-arrival (TOA) and time-difference-of-arrival (TDOA) are two methods that have been used several times to provide a solution for localization in indoor environments. A TDOA-based localization scheme has been proposed in which a light beam with a unique frequency was transmitted in different phases [34-36]. A TOA-based localization algorithm has also been developed [37]. The most important issue is that the TOA- and TDOA-based localization schemes are cost-effective and accurate. Furthermore, these schemes depend on the location information from a central node as well as from other reference nodes in the same indoor space. An extended Kalman filter-based TDOA approach that ignores the impact of this dependency on the reference node information has also been proposed [38]. However, deploying such a tracking algorithm is not always advantageous because the extended Kalman filter fails to accurately estimate the position as a first-order approximation.

Angle-of-arrival (AOA) is another method that can be applied for indoor localization. In this approach, the receiver estimates the angle of the transmitted signal by calculating the time difference of arrival from the individual transmitters. Transmitting the gain difference from the LED has been considered by AOA for indoor localization [30, 39, 40]. Simulation results depicted that the average value for

localization resolution was 35 cm. However, in our proposed system, it is not essential to gather the angle information from the LEDs to measure the position of the camera. The position is calculated with the help of a distance comparison among several LEDs (e.g., more than two LEDs). These distances are calculated using the photogrammetry technique [41], where the occupied image area of the LEDs on the image sensor varies with the distance. Furthermore, AOA exhibits some disadvantages: the accuracy degrades with increasing distance between the transmitter and receiver; the reflections of the signal from multipath add error during the location measurement and require large and complex hardware devices; and the shadowing and directivity of the measuring aperture have a significant effect on the overall performance of location estimation.

3. Indoor Transmitter Scenario

Figure 1(a) depicts a scenario of the proposed indoor transmitter and receiver setup for the MR positioning and navigation system using OCC. A circular white LED with a diameter of 9.5 cm was used as the transmitter. Because the LED exhibits an alterable luminary property, it can flicker at ultrahigh speeds that are beyond the perception of the human eye. The indoor location ID is modulated at a high modulation frequency and encoded with additional bits using an LED driver circuit. This encoded signal is fed to the LED, which continuously transmits the ID using the flickering light. The driver circuit comprises a 12-V dc power supply, an Arduino board, and a high-speed switching device, namely a metal-oxide-semiconductor field-effect transistor (MOSFET). The transmitter set with the LED driver circuit is depicted in Figure 1(b). The LEDs are installed on the ceiling, and the location number is transmitted to the MR receiver camera. In our system, the transmitters are arranged in a square-shaped regular pattern in which adjacent LEDs are 50 cm apart along each side. The positioning camera is mounted on top of the MR, which maintains an equal height along the z-axis. Therefore, the LEDs transmit modulated data bits that contain only the (x, y) location information.


Figure 2: Transmitted data bit mechanisms (bi-phase Manchester encoding; bit '0' is defined as '00' and bit '1' as '01'; modulation frequencies 2 kHz and 4 kHz; starting bits precede the data bits).

The visual flickering in optical wireless communication is a major problem that can be defined as the change in the illumination of the LED, as perceived by the human eye, when it transmits a binary '1'/'0' code-word. Generally, human eyes cannot perceive flickering if the data is modulated with a frequency greater than 200 Hz [42]. The LED transmitter should be driven using an accurate LED driver circuit to avoid any noticeable flicker. Multiple frequency-shift keying (MFSK) is used as the modulation technique to generate the modulated binary code-words using the LED driver circuit. Figure 2 depicts the characteristics of the transmitted bit pattern. Our proposed study uses modulation frequencies of 2 and 4 kHz, which are safe enough to avoid flickering. Further, the modulated signal is encoded using bi-phase Manchester line coding. Each of the binary bits 1 and 0 is defined using a different code: bit '1' is defined as '01', whereas bit '0' is defined as '00'. We transmitted five bits of data from each LED; therefore, in a single indoor system, 32 location IDs can be provided to 32 LEDs. Furthermore, if the actual data is '11011', the encoded signal will be '0101000101' after bi-phase Manchester line coding, as shown in Figure 2. Although there are 3 consecutive '0's in this example, the bi-phase Manchester encoding reduces the possibility of visual flickering. As this code stream is continuously transmitted, a starting bit symbol '11' is added at the beginning of the code-word to differentiate between consecutive code-words.
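The encoding steps above can be sketched in a few lines. This is a minimal sketch based on the description: the function name and the placement of the starting symbol directly before the payload are assumptions.

```python
# Minimal sketch of the bi-phase Manchester encoding described above:
# bit '1' -> '01', bit '0' -> '00', with the starting symbol '11'
# prepended to mark the beginning of each code-word (assumed placement).

START_SYMBOL = "11"

def encode_location_id(data_bits: str) -> str:
    """Encode a 5-bit location ID into the transmitted code-word."""
    payload = "".join("01" if b == "1" else "00" for b in data_bits)
    return START_SYMBOL + payload

# The paper's example: data '11011' encodes to payload '0101000101'.
print(encode_location_id("11011"))  # '110101000101'
```

With five data bits, the code-word length is fixed at 12 transmitted symbols, which makes it straightforward for the receiver to locate the next start symbol in the continuous stream.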

4. Receiver Characteristics

4.1. Android Application Background. An application was designed on the Android Studio platform using the camera2 hardware package, which was added in API level 21. Some Android features used to create the camera device and process the requests are described in this study. One request acts to capture a single frame and output a set of image buffers for the request. A CameraManager is required when multiple requests are in the queue to maintain the full frame rate. To accumulate the captured frame data, the getCameraCharacteristics class is used. The image data is encapsulated in Image objects and can be directly accessed through a class named ImageReader using the YUV_420_888 format. The image data is rendered onto a surface of a defined size. For processing the captured image frames, openCV 3.2.0 libraries were imported into the application. Initially, the camera shutter speed is controlled to focus only on the white LEDs and to detect only the region of interest (RoI).
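The RoI step above relies on the fast shutter leaving only the LED pixels bright, so a plain intensity threshold isolates the region of interest. The following NumPy-only sketch illustrates the idea (the paper uses openCV on Android; the function name, threshold value, and frame contents here are assumptions for illustration):

```python
import numpy as np

# Sketch of the RoI step: with a fast shutter only the LED stays bright,
# so thresholding the grayscale frame yields the LED's bounding box.
# Threshold and frame values are hypothetical.

def find_led_roi(gray: np.ndarray, thresh: int = 200):
    """Return (row0, row1, col0, col1) bounding box of pixels >= thresh,
    or None if no pixel is bright enough."""
    mask = gray >= thresh
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    if not rows.any():
        return None
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    return int(r0), int(r1), int(c0), int(c1)

frame = np.zeros((600, 800), dtype=np.uint8)   # 600x800 px, as in Sec. 5.3
frame[100:140, 300:360] = 255                  # synthetic bright LED blob
print(find_led_roi(frame))                     # (100, 139, 300, 359)
```

With several LEDs in view, the same thresholded mask would be split into one box per connected bright blob before decoding each strip pattern.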

4.2. Rolling Shutter Effect. Currently, most consumer cameras contain complementary metal-oxide-semiconductor (CMOS) sensors, which include rolling shutter mechanisms. The sequential read-out technique is a key feature of the rolling shutter camera, in which each frame is not captured at exactly the same instant [43]. The scene is captured by scanning rapidly, either horizontally or vertically, unlike a global shutter camera in which the complete scene is captured at the same time. In a rolling shutter camera, each row of pixels is exposed at once during the exposure time, and the read-out time prevents the rows of pixels from overlapping. In a single captured image, the rolling shutter allows multiple exposures. Therefore, for an LED that flickers on-off according to the modulated binary bit stream, the captured image contains a series of dark and white strips. The width of the strips depends on the modulation frequencies, and the number of strips depends on the distance.

Although a different camera may receive signals of similar frequencies from the same transmitter, the width of the strips differs because the specifications of the camera sensors vary from device to device. In our system, a multiple-frequency-modulated signal is transmitted, and the dark and white strips are received at different distances, as depicted in Figure 3. The camera captured the pictures with a fixed clockwise ImageView rotation of 270°. As the distance increases, the size of the LED on the IS decreases; further, the number of strips also decreases because the strip width remains identical for fixed frequencies at any distance for a particular camera.
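The strip geometry above can be made concrete with a short calculation. This is a sketch under stated assumptions: the sensor's row read-out rate is a hypothetical device-specific value, and each bright/dark strip is taken to span one half-period of a square-wave drive at the modulation frequency.

```python
# Rough sketch of the strip geometry described above. ROWS_PER_SEC is a
# hypothetical row read-out rate; real values are device-specific.

ROWS_PER_SEC = 30_000          # assumed sensor row read-out rate (rows/s)

def strip_width_rows(f_mod_hz: float) -> float:
    """Rows covered by one bright or dark strip: one half-period of the
    modulation, times the row read-out rate."""
    return ROWS_PER_SEC / (2.0 * f_mod_hz)

def strip_count(led_image_rows: int, f_mod_hz: float) -> int:
    """Strips visible inside the LED's image. The count shrinks with
    distance because the LED image shrinks while strip width is fixed."""
    return int(led_image_rows / strip_width_rows(f_mod_hz))

print(strip_width_rows(2000))   # 7.5 rows per strip at 2 kHz
print(strip_count(390, 2000))   # 52 strips for a 390-row LED image
```

This matches the qualitative behavior in Figure 3: halving the LED's image height halves the strip count, while changing the modulation frequency changes only the strip width.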

4.3. Introduction to MR Functionalities. Localization and navigation are challenging issues in optical camera communication for existing MRs. The initial requirement is to connect to the input interface of the MR to provide the necessary data from the Android phone, such as the location ID and its distribution system. When a camera captures the rolling images from multiple LEDs, it processes the images and obtains the locations, thereby making a decision about the location distribution system, and then feeds the data to the MR. The MR must exhibit a user input interface (UI) to set a target location ID, a database (DB) to store the ID distribution information, and a display screen to show the location information and the subsequent direction to move after making a decision. Figure 4 depicts the specifications of the MR and its required functionalities. It also exhibits a data- and image-processing unit to compare the IDs and a database to store the previous tracking information. The MR receives the IDs and can therefore localize itself. It also has the capability to compare the stored IDs and navigate by itself if the user sets a destination.

5. Operational Principle

5.1. Navigation Algorithm. The objective of the MRN system is to navigate an MR by receiving the locations or location IDs using multiple LED lights.


Figure 3: Experimentally, the number of dark and white strips decreases with the increase in distance: (a) 52 strips at a distance of 100 cm; (b) 28 strips at a distance of 250 cm; (c) 5 strips at a distance of 480 cm.

Figure 4: Block diagram of the mobile robot functionalities.

Figure 5 describes a flowchart of the navigation algorithm applied in our proposed system. A user inputs a target location into the smartphone that is mounted on top of an indoor MR. Because the complete environment of the room is unknown to the MR, it should first fix its position, which means a location should be observed inside the room. Therefore, it estimates the number of LEDs that are within the camera's FOV. To perform navigation, the MR should perceive the direction in which the location values are changing. Thus, a minimum of three LEDs should be within the camera's FOV, as depicted in Figure 1(a). In case I, the IS can only detect two LEDs, each of which exhibits the same value for the x location; thus, it is difficult to observe a variation in the x location. In cases II and III, there are four LEDs within the camera's FOV, as the MR is situated deeper inside the room. If three or more LEDs are detected using the IS, the smartphone receives the LED locations and compares them with the target location. If the target location matches any of the received locations, the MR moves the horizontal distance toward the floor position directly below the target LED.

As mentioned in the previous paragraph, to localize the MR to the floor position of a target LED, we need to measure the horizontal distance from the MR to that floor position. Here, the floor position of an LED means the position on the floor directly below the LED. Suppose the target LED is transmitting the (x, y) location ID. When the MR is located at its initial position, as depicted in Figure 6, the implemented Android smartphone app calculates the direct distance D from the ceiling LED. The direction toward the LED is predicted from the image formed on the IS (the image formation technique on the IS is well described in [42]). After


Figure 5: The mobile robot navigation flowchart.

obtaining the direct distance, we can easily find the horizontal distance s using the Pythagorean formula for a right triangle, as the vertical distance is fixed inside a room. After calculating the value of s and determining the direction to move, the MR navigates this distance and fixes its location to the (x, y) location. If the target location does not match any of the received location IDs, the MR estimates the nearest LED on the route to the destination. Subsequently, the MR traverses the required distance to a point exactly below that particular LED. The MR then again receives the locations of the LEDs within the camera's FOV, and this procedure continues until the destination is reached. If the camera detects only two LEDs within its FOV, the MR cannot determine the location distribution system; therefore, it initially chooses one ID, moves to a point located exactly below that LED, and then continues with the aforementioned procedure.
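The horizontal-distance step above is a one-line application of the Pythagorean theorem: with the fixed camera-to-ceiling height h and the measured direct distance D, the floor distance is s = sqrt(D^2 - h^2). A minimal sketch (function name and sample values are illustrative):

```python
import math

# Sketch of the horizontal-distance step: the vertical distance h is
# fixed indoors, the app measures the direct distance D to the target
# LED, and Pythagoras gives the floor distance s to traverse.

def horizontal_distance(direct_d: float, height: float) -> float:
    """s = sqrt(D^2 - h^2); D must be at least the vertical height."""
    if direct_d < height:
        raise ValueError("direct distance cannot be smaller than the height")
    return math.sqrt(direct_d**2 - height**2)

print(horizontal_distance(5.0, 3.0))   # 4.0
```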

5.2. Camera Field of View. We considered the indoor environment to possess a uniform square distribution of LEDs. Figure 7 depicts the manner in which the number of LEDs within the camera's FOV depends on different factors, such as the height of the room, the distance between two adjacent LEDs, and the angle of view (AOV), i.e., the FOV angle. For the purpose of navigation, the MR should understand the location distribution system among the LEDs. It is obvious that at least two diagonally located LED locations should be considered to make a decision about the direction in which the X and Y locations are increasing or decreasing. A relation has been derived to estimate the number of LEDs captured within the FOV at any time, considering the camera view angle, the height of the roof, and the surface area taken by a single LED.

The horizontal and vertical FOV can be estimated using the following two equations:

φ_h = 2 tan⁻¹(d_h / 2f)  (1)

φ_v = 2 tan⁻¹(d_v / 2f)  (2)

where φ_v and φ_h are the FOV angles in the vertical and horizontal directions, d_h and d_v are the sensor dimensions, and f represents the focal length of the camera lens. Figure 7 shows the camera FOV, whose area is projected vertically onto the roof at height h_r. We can represent the area of the FOV using the following equation:

A_FOV = 4 h_r² tan(φ_v) tan(φ_h)  (3)

Because the LEDs are distributed in a square pattern with each LED separated by a distance a along each side (where a > the LED diameter), each LED will occupy an area of a².


Figure 6: Calculation procedure of the distance to move horizontally.

Figure 7: Analysis of the camera's FOV.

Now, using (3), we can find the number of LEDs that will be within the camera FOV:

N_LED = 4 h_r² tan(φ_v) tan(φ_h) / a²  (4)
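Equations (1)-(4) can be evaluated mechanically as below. This is a numerical sketch only: the sensor dimensions d_h and d_v are hypothetical illustration values (they are not given in the text), while f = 4.2 mm, h_r = 2.56 m, and a = 0.5 m follow the setup described elsewhere in the paper. With such wide FOV angles, (4) yields an idealized upper bound that assumes the ceiling is fully tiled with LEDs.

```python
import math

# Numerical sketch of (1)-(4). d_h and d_v are assumptions; f, h_r and a
# follow the experimental setup described in the paper.

def fov_angle(d_sensor: float, f: float) -> float:
    """Full FOV angle in radians, from (1)/(2): 2 * atan(d / 2f)."""
    return 2.0 * math.atan(d_sensor / (2.0 * f))

def leds_in_fov(d_h: float, d_v: float, f: float, h_r: float, a: float) -> float:
    """Number of LEDs inside the FOV footprint, from (3) and (4)."""
    phi_h = fov_angle(d_h, f)
    phi_v = fov_angle(d_v, f)
    a_fov = 4.0 * h_r**2 * math.tan(phi_v) * math.tan(phi_h)  # (3)
    return a_fov / a**2                                        # (4)

# Hypothetical sensor 5.6 mm x 4.2 mm, f = 4.2 mm, h_r = 2.56 m, a = 0.5 m
print(leds_in_fov(d_h=5.6e-3, d_v=4.2e-3, f=4.2e-3, h_r=2.56, a=0.5))
```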

5.3. Exploring the ID Distribution System. The smartphone used to perform this study was a Samsung S7 Edge, which contains a 26-mm (equivalent) camera with a focal length of 4.2 mm. The height of the room observed using the camera was 2.56 m. We set up the Android camera with a height and width of 600 and 800 px, respectively. In a general scenario, there were always at least four LEDs within the camera's FOV under this type of setup. To define the coordinate distribution system, we divided

Figure 8: Region-basis location selection (LF, FR, RB, and BL regions) and exploration of their distribution system.

the IS into four regions, i.e., the left-front region (LFR), front-right region (FRR), right-back region (RBR), and back-left region (BLR), as depicted in Figure 8, where each region contains an equal number of pixels. The camera will detect a single LED per region and receive its location ID as well. We defined the locations on a regional basis, which are (XLF, YLF), (XFR, YFR), (XRB, YRB), and (XBL, YBL), respectively, within the four regions. It is obvious that the LEDs will be present either in both the LF and RB regions or in both the FR and BL regions. If XLF < XRB or XFR > XBL, the x-location increases in the right direction; otherwise, it increases in the left direction. Further, if YLF > YRB or YFR > YBL, the y-location increases in the front direction; otherwise, it increases in the back direction.
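The region-based direction test above can be expressed compactly. This sketch assumes the detector reports, per region, either the (X, Y) location ID of the single LED found there or None when that region is empty:

```python
def xy_directions(lf, fr, rb, bl):
    """Decide the directions of increasing x and y from the region-wise
    LED location IDs. Each argument is an (X, Y) tuple or None; LEDs
    appear either on the LF/RB diagonal or on the FR/BL diagonal."""
    if lf is not None and rb is not None:
        x_right = lf[0] < rb[0]   # XLF < XRB -> x increases rightward
        y_front = lf[1] > rb[1]   # YLF > YRB -> y increases frontward
    elif fr is not None and bl is not None:
        x_right = fr[0] > bl[0]   # XFR > XBL -> x increases rightward
        y_front = fr[1] > bl[1]   # YFR > YBL -> y increases frontward
    else:
        raise ValueError("need at least one diagonal pair of LEDs")
    return ("right" if x_right else "left",
            "front" if y_front else "back")
```

For example, detecting only the diagonal pair LF = (1, 2) and RB = (2, 1) yields increasing x to the right and increasing y to the front.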

6. Performance Analysis

6.1. Measurement of Distance Accuracy. The direct distance from the camera to an LED can be measured by considering a few distance parameters. If D is the direct distance of the LED $(x_i, y_j)$ from the camera lens and d is the distance of the formed image on the sensor from the camera lens, then we can write, from the general lens equation,

$\frac{1}{D} + \frac{1}{d} = \frac{1}{F}$  (5)

$\frac{d}{D} = \frac{F}{D - F}$  (6)

where F is the focal length of the lens. Here, a magnification factor must be introduced to compare the actual size of the LED with the size detected on the IS.

The magnification factor m can be written as follows:

$m = \sqrt{\frac{a_{ij}}{A_{ij}}} = \frac{d}{D}$  (7)

where $A_{ij}$ is the actual surface area of the nearest LED $(x_i, y_j)$ and $a_{ij}$ is the area covered by the detected image on the IS.


Table 1: Distance measurement data from an LED at different floor positions.

| Horizontal distance d_h (cm) | Measured direct distance (cm) | Actual direct distance (cm) | Error in distance measurement (cm) | Detected area (px) | 1st radius r (px) | 2nd radius r′ (px) |
|---|---|---|---|---|---|---|
| 0 | 255.5 | 255.5 | 0 | 348 | 11 | 11 |
| 10 | 255.88 | 255.9 | 0.02 | 347 | 11 | 10 |
| 20 | 256.65 | 256.7 | 0.05 | 341 | 11 | 10 |
| 30 | 257.6 | 257.7 | 0.1 | 337 | 11 | 10 |
| 40 | 258.85 | 259 | 0.15 | 336 | 11 | 10 |
| 50 | 260 | 260.2 | 0.2 | 330 | 11 | 9 |
| 60 | 262.5 | 262.8 | 0.3 | 324 | 11 | 9 |
| 70 | 264.3 | 264.7 | 0.4 | 314 | 11 | 9 |
| 80 | 267.1 | 267.7 | 0.6 | 306 | 11 | 9 |
| 90 | 269.5 | 270.61 | 1.11 | 298 | 11 | 8 |
| 100 | 272.5 | 274.5 | 2 | 288 | 11 | 8 |

Normally, it is observed that F ≪ D; therefore, d/D ≈ F/D, and we can combine the above equations as follows:

$a_{ij} = m^2 A_{ij}$  (8)

$a_{ij} = \frac{F^2}{D^2} A_{ij}$  (9)

$D = F\sqrt{\frac{A_{ij}}{a_{ij}}}$  (10)

where the focal length F and the actual LED size $A_{ij}$ are known. The only requirement is to obtain the detected LED area $a_{ij}$ from the IS.
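Eq. (10) translates directly into code once the detected pixel area is converted into physical sensor area. The pixel-size parameter below is our own assumption introduced for unit consistency, since the text reports the detected area purely in pixel counts:

```python
import math

def direct_distance(F, A_led, a_px, px_size):
    """Eq. (10): D = F * sqrt(A_ij / a_ij).
    F and px_size in mm, A_led in mm^2, a_px in pixels;
    returns the direct distance D in mm."""
    a_img = a_px * px_size**2  # detected area converted to mm^2 on the sensor
    return F * math.sqrt(A_led / a_img)
```

With all unit factors set to 1, an LED of area 4 detected with area 1 gives D = 2F, matching the square-root relation.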

Using (10), the distance can be measured accurately at a position at which the camera is vertically in the LOS of the LED. However, when the MR navigates through places that are not directly below the LED, this measurement can experience errors, because the circular LED then appears to the camera as an ellipse. This happens due to the radial distortion of the camera [44]. Wide-angle, short-focal-length, and fisheye cameras are the main sources of radial distortion. Determining the intrinsic, extrinsic, and distortion coefficients of the camera's IS and lens is required to remove the effect of radial distortion, which significantly transforms the projected object relative to the actual object even at a normal focal length. To verify the localization scheme, a back-projection method is deployed to measure the LED area from the distorted image of real-world space [45]. In the following section, we exhibit a figure illustrating the accuracy of the distance measurement. Although the accuracy remains greater than 98% even at the boundary position, we have undertaken one further step to mitigate this error. Let r be the radius of the circular LED, so that its area is $\pi r^2$. When the circular shape changes to an ellipse, one radius r reduces to r′ and the area becomes $\pi r r'$. Table 1 presents the data measured from an LED with a size of 7.1 cm² located on a ceiling of height 2.56 m. The pixels allocated for the detected LED at different positions are

Figure 9: Percentage of accuracy of the measured direct distances from the camera to the LED at different horizontal points.

recorded to show that the detected contour becomes smaller at far distances than at near ones. We tabulated the radius of the ellipse, which is required to calculate the actual size of the ellipse and correct the positioning error.

This radius data is integrated within the application and is called upon when the radius is measured by the smartphone through image processing. Comparing the recorded data with the actual area of the LED enables prediction of the direct distance.

The distance between the LED and the camera changes when the MR moves horizontally. The direct distance from a particular LED is required to calculate the horizontal distance the MR must move to reach the floor location of that LED. We used (10) to measure the direct distance between the camera and the LED. Figure 9 graphically represents the percentage of accuracy observed while measuring the direct distance at particular points on the floor. The distance is measured with 100% accuracy when the camera is situated directly below the LED. When the


Figure 10: Distance error comparison in terms of the variation of the number of LEDs (2, 3, or 4) inside the camera FOV at different horizontal points: (a) inter-LED distance = 50 cm and (b) inter-LED distance = 100 cm.

camera is shifted horizontally far from the floor location of the LED, the accuracy is reduced.

To evaluate the performance of the proposed localization algorithm, we tested the accuracy of the distance measurement for different scenarios. We obtained a maximum distance measurement error of 2 cm in the case of a single LED inside the camera FOV. However, when there are multiple LEDs inside the camera FOV, the distance error increases, because the probability of an LED being at the middle of the IS is reduced. Figure 10 compares the error in measuring the distance between the LEDs and the camera for different numbers of LEDs within the camera FOV. We considered two scenarios: first, the case in which the inter-LED distance is 50 cm, and second, the case in which it is 100 cm. In each scenario, we considered 2 to 4 LEDs for the comparison. We started measuring from the floor position of an LED and took 10 more readings, moving away by about 10 cm in each step. The figure shows that when there are more LEDs in the IS at the same time, the distance measurement error increases, because the chance of an LED being detected at the center of the IS is correspondingly reduced. As our system comprises LEDs distributed in a rectangular pattern, 4 LEDs are most likely to be in the IS at a time. For this case, we must accept up to 7.5 cm of distance error for positioning and navigation purposes when the inter-LED distance is 50 cm. For the second case, described in Figure 10(b), the distance error increases up to 8.5 cm for 4 LEDs due to the increase of the inter-LED spacing to 100 cm.

Although the distance error decreases with fewer LEDs allowed into the camera FOV, more than two LEDs are required in the camera FOV to obtain the navigation information, so this error must be accepted. We have also described a way in this section that can be applied to reduce the distance error to a certain level when moving a horizontal distance to reach the target LED location, thereby increasing the positioning performance.

Figure 11: BER comparison for M-FSK (M = 2, 4, 8) in an indoor wireless Rician fading channel.

6.2. Bit Error Rate for MFSK. To evaluate the bit error rate (BER) performance, we considered an indoor wireless Rician fading channel. MFSK is simple and flexible for multilink communication over this channel. It is known [46] that the bit error probability for MFSK is

$\rho_s = \frac{1}{M}\sum_{i=2}^{M}(-1)^i\binom{M}{i}\exp\left[-\left(1-\frac{1}{i}\right)\rho\right]$  (11)

where M is the modulation order and $\rho$ is the instantaneous signal-to-noise ratio (SNR) value. In Figure 11, we depict a BER comparison between different modulation


Figure 12: Experimental setup: (a) four LEDs, each of area A = 7.1 cm² and placed 50 cm apart from each other, transmit the location IDs; (b) captured image after setting the exposure time to 1/8000 sec.

orders (M = 2, 4, 8). We observed that the bit error probability decreases with increasing M when the transmission keeps the same bit energy, which means that the symbol duration doubles for 4-FSK compared to 2-FSK while the same bit rate is maintained. In such a case, the BER becomes lower because the symbol duration increases with higher modulation order.
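Eq. (11) is easy to evaluate numerically; a minimal sketch, assuming the SNR is given on a linear scale rather than in dB, is as follows. For M = 2 the sum collapses to the familiar $\frac{1}{2}e^{-\rho/2}$ of noncoherent binary FSK:

```python
import math

def mfsk_error_probability(M, snr):
    """Eq. (11): rho_s = (1/M) * sum_{i=2}^{M} (-1)^i * C(M, i)
    * exp(-(1 - 1/i) * snr), with snr the instantaneous SNR (linear)."""
    return (1 / M) * sum(
        (-1) ** i * math.comb(M, i) * math.exp(-(1 - 1 / i) * snr)
        for i in range(2, M + 1)
    )
```

For example, `mfsk_error_probability(2, snr)` equals `0.5 * exp(-snr / 2)` for any SNR value.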

7. Demonstration Results

We used a platform to demonstrate our OCC system and evaluate its performance. To understand the ID distribution system, the location information should be received using a minimum of two diagonally located LEDs, as described in Section 5. The platform comprises four LED transmitter links placed in a square shape, as depicted in Figure 12(a). The LEDs are arranged with a distance of 50 cm between them. For better explanation, we considered these transmitters to be a single unit. These four transmitters are linked continuously and transmit four different locations, which are encoded with the help of a PC and an LED driver circuit.

To focus only on the target transmitting LED, we controlled the camera exposure time. The exposure time is the duration for which each pixel is exposed to light. Using this process, we can obtain the dark and white strips in the region in which the LEDs are captured, by darkening all parts of the image except those depicting the target LEDs. Figure 12(b) depicts the image of the transmitting LED unit captured using the Samsung S7 (edge) camera after setting the camera's CaptureRequest.SENSOR_EXPOSURE_TIME to 1/8000 sec. We can observe that all four LEDs exhibit the same dark and bright strip widths. This is because the same modulation frequencies, ranging from 2 to 4 kHz, were used for all four transmitters. Generally,

the characteristics of the rolling-shutter image are such that the strips are formed horizontally. However, in our case, we rotated the image using imageView by 270°, which is why the strips appear vertically.

To demodulate the data from the detected strip pattern, we should initially measure the width of each dark and white strip. Different cameras have different frame rates; moreover, for a fixed camera, the frame rate fluctuates within a short range. For this reason, the width also varies, and as the width is measured as a number of pixels in a row, it should be an integer value. In our system, we observed brighter strips whose width varied between 5 and 7 pixels, whereas the width varied between 1 and 3 pixels for darker strips. Therefore, the midvalue was considered to be four to distinguish between dark and white patterns. When the measured width is greater than four, the smart phone considers the LED to be in the ON state and stores a binary '1' value; otherwise, the LED is assumed to be in the OFF state and a '0' value is stored. Figure 13 illustrates the android implementation software that can simultaneously receive data from four LEDs. We received five bits of ID per LED and assigned each combination of the received binary ID to a fixed location in the application system. The table in Figure 13 depicts the manner in which the location is assigned to different IDs. To receive multiple LEDs, the overall time for processing one frame should be within the critical frame time; otherwise, the application may be stopped. The critical frame time is defined as the maximum processing time for one frame. In our system, we used 20 fps to limit the critical frame time to 50 msec. We add a starting symbol to recognize when the data begins, so the time to get the data depends on the timing of the frame containing that symbol. We tested in real time by debugging the software in Android Studio, observing the time required for initialization, frame processing, demodulation, and device response. We found that almost


Figure 13: Implemented android application receiving multiple LED IDs simultaneously. The inset table maps each 5-bit ID to a coordinate: 00000 → (1,1), 00001 → (1,2), 00010 → (2,1), 00011 → (2,2), 00100 → (1,3).

1000 ms is required for debugging and obtaining the device response when communicating with 4 LEDs simultaneously. As each LED transmits 10 bits, we obtained data at a 40-bps rate for 4 LEDs with this device.
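The width-threshold demodulation rule described above can be sketched in a few lines; the coordinate assignments in the ID table are as read from the partially legible table in Figure 13:

```python
def demodulate_strips(strip_widths, threshold=4):
    """Map measured strip widths (in pixels) to bits: bright strips
    were observed at 5-7 px (LED ON -> '1') and dark strips at
    1-3 px (LED OFF -> '0'), so 4 px serves as the mid-value threshold."""
    return ''.join('1' if w > threshold else '0' for w in strip_widths)

# 5-bit location IDs mapped to coordinates, per the table in Figure 13.
ID_TABLE = {'00000': (1, 1), '00001': (1, 2), '00010': (2, 1),
            '00011': (2, 2), '00100': (1, 3)}
```

For instance, a run of measured widths [6, 2, 3, 7, 1] demodulates to the bit string '10010'.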

8. Conclusion

This work focuses on the reception of data from multiple indoor LED transmitters as well as on the localization and navigation of the MR by processing the received data. The indoor LED locations are received as binary data of five bits per link at a data rate of 40 bps. The proposed algorithm was implemented in an indoor platform in which four LEDs, each with a size of 7.1 cm², were kept at a distance of 50 cm from each other. The four LEDs transmit four different IDs, and the android application developed to receive these IDs performed well on all links simultaneously. We were able to communicate with the LEDs from a maximum distance of 480 cm. As our positioning approach requires measurement of the direct distance from the camera to the LED, we compared the accuracy of the distance measurement at different floor positions. Keeping the floor height constant, we experienced an increase in the percentage of error when the camera moved a long distance horizontally. To mitigate this error, we fixed some recorded distance values within the application for comparison. We found a maximum error of 2 cm at a 100-cm horizontal distance from the floor position of the target LED when the MR moved horizontally toward that LED. Thus, the accuracy is improved from the localization

and navigation viewpoints as well as in terms of the distance for indoor communication.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the Global IT Talent support program (IITP-2017-0-01806) supervised by the IITP (Institute for Information and Communication Technology Promotion), and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (Performance Enhancement and Resource Management of Visual MIMO-based Camera Communication Networks) (No. NRF-2016R1D1A1A09919091).

References

[1] M. Z. Chowdhury, M. T. Hossan, A. Islam, and Y. M. Jang, "A Comparative Survey of Optical Wireless Technologies: Architectures and Applications," IEEE Access, vol. 6, pp. 9819–9840, 2018.
[2] M. T. Hossan, M. Z. Chowdhury, A. Islam, and Y. M. Jang, "A Novel Indoor Mobile Localization System Based on Optical Camera Communication," Wireless Communications and Mobile Computing, vol. 2018, 17 pages, 2018.
[3] A. Khalajmehrabadi, N. Gatsis, and D. Akopian, "Modern WLAN Fingerprinting Indoor Positioning Methods and Deployment Challenges," IEEE Communications Surveys & Tutorials, vol. 19, no. 3, pp. 1974–2002, 2017.
[4] E. Kupershtein, M. Wax, and I. Cohen, "Single-site emitter localization via multipath fingerprinting," IEEE Transactions on Signal Processing, vol. 61, no. 1, pp. 10–21, 2013.
[5] P. Luo et al., "Experimental demonstration of RGB LED-based optical camera communications," IEEE Photonics Journal, vol. 7, no. 5, pp. 1–12, 2015.
[6] R. D. Roberts, "Undersampled frequency shift ON-OFF keying (UFSOOK) for camera communications (CamCom)," in Proceedings of the 22nd Wireless and Optical Communications Conference (WOCC 2013), pp. 645–648, China, May 2013.
[7] P. Luo, Z. Ghassemlooy, H. L. Minh, H. Tsai, and X. Tang, "Undersampled-PAM with subcarrier modulation for camera communications," in Proceedings of the 20th Opto-Electronics and Communications Conference (OECC), pp. 1–3, Shanghai, China, 2015.
[8] Y.-Y. Won, S.-H. Yang, D.-H. Kim, and S.-K. Han, "Three-dimensional optical wireless indoor positioning system using location code map based on power distribution of visible light emitting diode," IET Optoelectronics, vol. 7, no. 3, pp. 77–83, 2013.
[9] C. Yang and H.-R. Shao, "WiFi-based indoor positioning," IEEE Communications Magazine, vol. 53, no. 3, pp. 150–157, 2015.


[10] R. Xu, W. Chen, Y. Xu, and S. Ji, "A New Indoor Positioning System Architecture Using GPS Signals," Sensors, vol. 15, no. 5, pp. 10074–10087, 2015.
[11] Z. Zhou, M. Kavehrad, and P. Deng, "Indoor positioning algorithm using light-emitting diode visible light communications," Optical Engineering, vol. 51, no. 8, 2012.
[12] W. Zhang, M. I. S. Chowdhury, and M. Kavehrad, "Asynchronous indoor positioning system based on visible light communications," Optical Engineering, vol. 53, no. 4, Article ID 045105, 2014.
[13] S.-Y. Jung, S. Hann, S. Park, and C.-S. Park, "Optical wireless indoor positioning system using light emitting diode ceiling lights," Microwave and Optical Technology Letters, vol. 54, no. 7, pp. 1622–1626, 2012.
[14] H.-S. Kim, D.-R. Kim, S.-H. Yang, Y.-H. Son, and S.-K. Han, "An indoor visible light communication positioning system using a RF carrier allocation technique," Journal of Lightwave Technology, vol. 31, no. 1, pp. 134–144, 2013.
[15] K. Panta and J. Armstrong, "Indoor localisation using white LEDs," IEEE Electronics Letters, vol. 48, no. 4, pp. 228–230, 2012.
[16] A. D. Cheok and L. Yue, "A novel light-sensor-based information transmission system for indoor positioning and navigation," IEEE Transactions on Instrumentation and Measurement, vol. 60, no. 1, pp. 290–299, 2011.
[17] M. Nakajima and S. Haruyama, "New indoor navigation system for visually impaired people using visible light communication," EURASIP Journal on Wireless Communications and Networking, vol. 2013, article 37, 2013.
[18] H. Huang, A. Yang, L. Feng, G. Ni, and P. Guo, "Indoor Positioning Method Based on Metameric White Light Sources and Subpixels on a Color Image Sensor," IEEE Photonics Journal, vol. 8, no. 6, 2016.
[19] M. S. Hossen, Y. Park, and K. Kim, "Performance improvement of indoor positioning using light-emitting diodes and an image sensor for light-emitting diode communication," Optical Engineering, vol. 54, no. 3, pp. 1–11, 2015.
[20] M.-G. Moon, S.-I. Choi, J. Park, and J. Y. Kim, "Indoor positioning system using LED lights and a dual image sensor," Journal of the Optical Society of Korea, vol. 19, no. 6, pp. 586–591, 2015.
[21] Y.-S. Kuo, P. Pannuto, K.-J. Hsiao, and P. Dutta, "Luxapose: Indoor positioning with mobile phones and visible light," in Proceedings of the 20th ACM Annual International Conference on Mobile Computing and Networking (MobiCom 2014), pp. 447–458, USA, September 2014.
[22] P. Huynh and M. Yoo, "VLC-based positioning system for an indoor environment using an image sensor and an accelerometer sensor," Sensors, vol. 16, no. 6, 2016.
[23] H. Steendam, T. Q. Wang, and J. Armstrong, "Theoretical lower bound for indoor visible light positioning using received signal strength measurements and an aperture-based receiver," Journal of Lightwave Technology, vol. 35, no. 2, pp. 309–319, 2017.
[24] F. Yin, Y. Zhao, F. Gunnarsson, and F. Gustafsson, "Received-signal-strength threshold optimization using Gaussian processes," IEEE Transactions on Signal Processing, vol. 65, no. 8, pp. 2164–2177, 2017.
[25] X. Piao, Y. Zhang, T. Li et al., "RSS fingerprint based indoor localization using sparse representation with spatio-temporal constraint," Sensors, vol. 16, no. 11, 2016.
[26] C. Feng, W. S. A. Au, S. Valaee, and Z. Tan, "Received-signal-strength-based indoor positioning using compressive sensing," IEEE Transactions on Mobile Computing, vol. 11, no. 12, pp. 1983–1993, 2012.
[27] W. Gu, W. Zhang, M. Kavehrad, and L. Feng, "Three-dimensional light positioning algorithm with filtering techniques for indoor environments," Optical Engineering, vol. 53, no. 10, 2014.
[28] S.-H. Yang, E.-M. Jung, and S.-K. Han, "Indoor location estimation based on LED visible light communication using multiple optical receivers," IEEE Communications Letters, vol. 17, no. 9, pp. 1834–1837, 2013.
[29] F. I. K. Mousa, H. Le-Minh, Z. Ghassemlooy et al., "Indoor localization system utilizing two visible light emitting diodes," Optical Engineering, vol. 55, no. 11, 2016.
[30] K. Wang, A. Nirmalathas, C. Lim, K. Alameh, and E. Skafidas, "Optical wireless-based indoor localization system employing a single-channel imaging receiver," Journal of Lightwave Technology, vol. 34, no. 4, pp. 1141–1149, 2016.
[31] S.-H. Jung, B.-C. Moon, and D. Han, "Unsupervised learning for crowdsourced indoor localization in wireless networks," IEEE Transactions on Mobile Computing, vol. 15, no. 11, pp. 2892–2906, 2016.
[32] X. Wang, L. Gao, S. Mao, and S. Pandey, "CSI-Based Fingerprinting for Indoor Localization: A Deep Learning Approach," IEEE Transactions on Vehicular Technology, vol. 66, no. 1, pp. 763–776, 2017.
[33] Y. Hu and G. Leus, "Robust differential received signal strength-based localization," IEEE Transactions on Signal Processing, vol. 65, no. 12, pp. 3261–3276, 2017.
[34] T.-H. Do and M. Yoo, "TDOA-based indoor positioning using visible light," Photonic Network Communications, vol. 27, no. 2, pp. 80–88, 2014.
[35] S.-Y. Jung, S. Hann, and C.-S. Park, "TDOA-based optical wireless indoor localization using LED ceiling lamps," IEEE Transactions on Consumer Electronics, vol. 57, no. 4, pp. 1592–1597, 2011.
[36] U. Nadeem, N. U. Hassan, M. A. Pasha, and C. Yuen, "Highly accurate 3D wireless indoor positioning system using white LED lights," IEEE Electronics Letters, vol. 50, no. 11, pp. 828–830, 2014.
[37] T. Q. Wang, Y. A. Sekercioglu, A. Neild, and J. Armstrong, "Position accuracy of time-of-arrival based ranging using visible light with application in indoor localization systems," Journal of Lightwave Technology, vol. 31, no. 20, pp. 3302–3308, 2013.
[38] O. Ennasr, G. Xing, and X. Tan, "Distributed time-difference-of-arrival (TDOA)-based localization of a moving target," in Proceedings of the 55th IEEE Conference on Decision and Control (CDC 2016), pp. 2652–2658, USA, December 2016.
[39] S. H. Yang, E. M. Jeong, and S. K. Han, "Indoor positioning based on received optical power difference by angle of arrival," IEEE Electronics Letters, vol. 50, no. 1, pp. 49–51, 2014.
[40] A. Arafa, X. Jin, and R. Klukas, "Wireless indoor optical positioning with a differential photosensor," IEEE Photonics Technology Letters, vol. 24, no. 12, pp. 1027–1029, 2012.
[41] M. T. Hossan, M. Z. Chowdhury, M. K. Hasan et al., "A New Vehicle Localization Scheme Based on Combined Optical Camera Communication and Photogrammetry," Mobile Information Systems, vol. 2018, Article ID 8501898, 14 pages, 2018.
[42] S. M. Berman, D. S. Greenhouse, I. L. Bailey, R. D. Clear, and T. W. Raasch, "Human electroretinogram responses to video displays, fluorescent lighting, and other high frequency sources," Optometry and Vision Science, vol. 68, no. 8, pp. 645–662, 1991.


[43] H.-Y. Lee, H.-M. Lin, Y.-L. Wei, H.-I. Wu, H.-M. Tsai, and K. C.-J. Lin, "RollingLight: Enabling line-of-sight light-to-camera communications," in Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services (MobiSys 2015), pp. 167–180, Italy, May 2015.
[44] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[45] H. A. Martins, J. R. Birk, and R. B. Kelley, "Camera models based on data from two calibration planes," Computer Graphics and Image Processing, vol. 17, no. 2, pp. 173–180, 1981.
[46] E. Arthurs and H. Dym, "On the Optimum Detection of Digital Signals in the Presence of White Gaussian Noise—A Geometric Interpretation and a Study of Three Basic Data Transmission Systems," IRE Transactions on Communications Systems, vol. 10, no. 4, pp. 336–372, 1962.


As the room contains LED light, there is no possibility that objects will interrupt the LOS communication.

OCC uses an unlicensed spectrum in which LED infrastructures are used as the transmitters. An LED is an electronic light source whose illumination intensity can be controlled at high frequency using an externally driven circuit. The LED is a useful transmitter because it is energy efficient, is already common in indoor lighting infrastructure, and is cheap; additionally, highly accurate data transmission is possible due to the variable luminary properties of LEDs. Digital data is transmitted through the LED light by varying the properties of the LED according to different modulation schemes. Typically, for low-frame-rate (25–30 fps) commercial cameras, undersampling modulation techniques are used. These techniques include phase-shift on-off keying (PSOOK) [5], frequency-shift on-off keying (FSOOK) [6], and m-ary pulse amplitude modulation (PAM) [7].

A color camera, which is used as the receiver for OCC applications, typically comprises an image sensor (IS), a lens, and a Bayer filter. A camera exhibits several benefits over a photodiode (PD) in terms of its extended field of view (FOV) and because the pixels of the IS can receive light from various directions, thereby providing a high signal-to-noise ratio (SNR). Localization is possible at cm-level accuracy in an indoor environment under stable illumination from an external light source. Several positioning schemes using a VLC system have already been proposed. In one report, a VLC-based positioning system in which the LEDs wirelessly transmit the location information was proposed [8]. The location information signals were received by a PD, and the spatial distribution was measured at each reference point. Further, information about people or objects can be obtained using map positioning. Some localization or navigation services using OCC employ an external cloud server to store the mapping data and to compute the position, with the camera connected to the server via Wi-Fi or Bluetooth during location estimation [9]. This approach exhibits a few drawbacks because it requires an external server and an additional RF network, which is expensive and time consuming.

To localize a rolling shutter (RS) camera mounted on an MR, the location information should be received by the camera from multiple LEDs. We propose an android-phone-mounted mobile robot positioning (MRP) approach using multiple frequency-shift keying (MFSK), in which we are able to obtain four location IDs from four different LEDs simultaneously. We enhance the distance measurement technique so that it allows a maximum error of 2 cm at a 100-cm horizontal distance in the case of a single LED inside the FOV, even when the camera and LED are not in vertical line-of-sight (LOS). An MRN algorithm is also proposed to navigate the MR to different locations. Relevant work on indoor localization is described in Section 2. Section 3 describes the overall scenario of the indoor MRP and MRN systems as well as the transmitter characteristics. A brief description of the android application and the openCV libraries, the rolling shutter effects, and the MR specifications is provided in Section 4. The proposed algorithm and the remaining operational characteristics, such as the camera FOV and the exploration

of ID distribution systems, are described in Section 5. In Section 6, the details of the accuracy enhancement technique are provided. A demonstration scenario and results are presented in Section 7. Finally, this research work concludes in Section 8.

2. Related Works

Currently, camera-bearing mobile phones are commonly used and are important commodities in e-commerce. Therefore, location-based services (LBS) have become an important issue. To ensure better performance in LBS, it is mandatory to measure the locations of these mobile devices accurately, especially indoors.

GPS is a pseudolite system that faces challenges in indoor localization. It is a LOS-based localization solution that accumulates sensor information from satellites located at a considerable distance (20,000 km) from the ground. The satellite signals are interrupted by obstacles on the ground, such as trees and buildings, due to the LOS channel characteristics and the considerable distance between satellites and sensors. Therefore, considerably large modifications are required to make the GPS system suitable for indoor localization. For example, the integration of GPS signals with indoor transmitting antennas has been reported to localize sensor nodes [10]; however, it is not cost effective, and a localization error of 1 m with a delay of 10 sec was observed.

The distinct and novel characteristics of OWC-based techniques make them important supporting candidates over existing solutions for indoor localization and navigation scenarios. Several VL-based localization and navigation schemes for mobiles, PDs, or wearable computers have been reported [11–17]. There is still considerable debate about the achievable localization resolution; lower resolution values were observed from the simulation results of various conditional approaches. An important subpart of OWC is OCC, in which the camera receives a signal from the modulated light beam of an LED. An IS detects the intensity of the metameric LEDs within a fixed indoor environment [18]. The localization performance exhibits a 1-degree orientation measurement, and the calculated resolutions were observed to be 1.5 and 3.58 cm for 2D and 3D space, respectively. Without measuring the angular displacement, a 0.001-m localization resolution has been observed [19]. In a similar approach, localization using an LED transmitter and a camera as the receiver has been discussed by other groups [20, 21]. Additionally, the information from an accelerometer sensor has been included; further, the demodulated data from the IS could be used to improve the overall performance in a 3D indoor environment [22].

Popular localization methods measure the signal strength at the receiver, i.e., a photodiode or camera; the signal can be either visible light or RF. An RSS-based localization scheme for an aperture-based receiver, which exhibits a wide FOV and better angular diversity, has been proposed [23]. The authors derived the Cramer-Rao lower bound of position estimation for improving the overall

Wireless Communications and Mobile Computing 3

Figure 1: A scenario of the proposed model: (a) the MRP and MRN system using OCC, showing positions (Cases I-III) at which two or four LEDs fall within the camera FOV in the world coordinate frame, with optical communication links from the ceiling LEDs to the camera; (b) the transmitter set, an LED driven from a DC 12 V supply through an Arduino and a high-speed switching device.

performance. Meanwhile, a generic framework of RSS has been introduced to enrich the positioning performance [24]. Alternatively, the combination of spatiotemporal constraints in RSS fingerprint-based localization has achieved the same purpose [25]. Compressive sensing has been applied to improve the localization performance in noisy scenarios [26]. The implication of the RSS approach was also observed in the VL-based positioning schemes [27-29]; here, the localization resolution was observed to be 4 m [30]. Importantly, an artificial intelligence algorithm that was applied to collect the location-labeled fingerprints was reported for optimizing the overall performance [31, 32]. Although the positioning accuracy obtained for the RSS fingerprints was observed to be 80% [33], a 3D space fingerprint requires considerable overhead for positioning a high-speed object as compared with a simpler 2D space fingerprint.

Time-of-arrival (TOA) and time-difference-of-arrival (TDOA) are two methods that have been used several times to provide a solution for localization in indoor environments. A TDOA-based localization scheme has been proposed in which a light beam with a unique frequency was transmitted in different phases [34-36]. A TOA-based localization algorithm has also been developed [37]. The most important issue is that the TOA- and TDOA-based localization schemes are cost-effective and accurate. Furthermore, these schemes depend on the location information from a central node as well as from other reference nodes in the same indoor space. An extended Kalman filter-based TDOA approach that ignores the impact of this dependency on the reference node information has also been proposed [38]. However, deploying such a tracking algorithm is not always advantageous because the extended Kalman filter, being a first-order approximation, can fail to accurately estimate the position.

Angle-of-arrival (AOA) is another method that can be applied for indoor localization. In this approach, the receiver estimates the angle of the transmitted signal by calculating the time difference of arrival from the individual transmitters. Transmitting the gain difference from the LED has been considered by AOA for indoor localization [30, 39, 40]. Simulation results depicted that the average localization resolution was 35 cm. However, in our proposed system it is not essential to gather the angle information from the LEDs to measure the position of the camera. The position is calculated with the help of a distance comparison among several LEDs (e.g., more than two LEDs). These distances are calculated using the photogrammetry technique [41], where the image area occupied by the LEDs on the image sensor varies with the distance. Furthermore, AOA exhibits some disadvantages: the accuracy degrades with increasing distance between the transmitter and receiver; reflections of the signal from multipath add error during the location measurement and require large and complex hardware devices; and the shadowing and directivity of the measuring aperture have a significant effect on the overall performance of location estimation.

3. Indoor Transmitter Scenario

Figure 1(a) depicts the proposed indoor transmitter and receiver scenario for the MR positioning and navigation system using OCC. A circular white LED with a diameter of 9.5 cm was used as the transmitter. Because the LED exhibits fast-altering luminary properties, it can flicker at ultrahigh speeds that are beyond the perception of the human eye. The indoor location ID is modulated with a high modulation frequency and encoded with additional bits using an LED driver circuit. This encoded signal is then fed to the LED, which continuously transmits the ID using the flickering light. The driver circuit comprises a 12-V DC power supply, an Arduino board, and a high-speed switching device, namely a metal-oxide-semiconductor field-effect transistor (MOSFET). The transmitter set with the LED driver circuit is depicted in Figure 1(b). The LEDs are installed on the ceiling, and the location number is transmitted to the MR receiver camera. In our system, the transmitters are arranged in a square-shaped regular pattern in which adjacent LEDs are 50 cm apart along each side. The camera of the positioning system is mounted on top of the MR and maintains an equal height along the z-axis. Therefore, the LEDs transmit modulated data bits that contain only the (x, y) location information.


Figure 2: Transmitted data bit mechanisms. The starting bits precede the data bits; each data bit is bi-phase Manchester encoded ('0' -> '00', '1' -> '01') with pulse periods of 500 us (2 kHz) and 250 us (4 kHz), so the example data '11011' becomes the code-word '0101000101'.

Visual flicker in optical wireless communication is a major problem, defined as a change in the illumination of the LED, as perceived by the human eye, when it transmits a binary '1'/'0' code-word. Generally, human eyes cannot perceive flicker if the data is modulated with a frequency greater than 200 Hz [42]. The LED transmitter should therefore be driven by an appropriate LED driver circuit to avoid any noticeable flicker. Multiple frequency-shift keying (MFSK) is used as the modulation technique to generate the modulated binary code-words using the LED driver circuit. Figure 2 depicts the characteristics of the transmitted bit pattern. Our proposed study uses modulation frequencies of 2 and 4 kHz, which are high enough to avoid flicker. Further, the modulated signal is encoded using bi-phase Manchester line coding, in which each of the binary bits 1 and 0 is defined by a different code: bit '1' is defined as '01', whereas bit '0' is defined as '00'. We transmitted five bits of data from each LED; therefore, in a single indoor system, 32 location IDs can be provided to 32 LEDs. Furthermore, if the actual data is '11011', the encoded signal after bi-phase Manchester line coding will be '0101000101', as shown in Figure 2. Although there are three consecutive '0's in this example, the bi-phase Manchester encoding eliminates the possibility of visual flicker because it confirms one transition in the middle of each bit. As this code stream is transmitted continuously, a starting symbol '11' is added at the beginning of the code-word to differentiate between consecutive code-words.
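The encoding just described can be sketched in a few lines (a minimal Python sketch; the function names are ours, not taken from the paper's firmware):

```python
def manchester_encode(data_bits: str) -> str:
    """Bi-phase Manchester mapping used in the paper: '1' -> '01', '0' -> '00'."""
    table = {"1": "01", "0": "00"}
    return "".join(table[b] for b in data_bits)

def build_codeword(location_id: str) -> str:
    """Prepend the '11' starting symbol so consecutive code-words can be framed."""
    return "11" + manchester_encode(location_id)

# The paper's example: data '11011' encodes to '0101000101'.
print(manchester_encode("11011"))  # -> 0101000101
print(build_codeword("11011"))     # -> 110101000101
```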

4. Receiver Characteristics

4.1. Android Application Background. An application was designed on the Android Studio platform using the camera2 hardware package, which was added in API level 21. Some Android features used in this study to create the camera device and process requests are described here. One request acts to capture a single frame and output a set of image buffers for the request. A CameraManager is required when multiple requests are queued in order to maintain the full frame rate. To accumulate the captured frame data, the getCameraCharacteristics class is used. The image data is encapsulated in Image objects and can be accessed directly through the ImageReader class using the YUV_420_888 format. The image data is rendered onto a surface of defined size. For processing the captured image frames, OpenCV 3.2.0 libraries were imported into the application. Initially, the camera shutter speed is controlled to focus only on the white LEDs and to detect only the region of interest (RoI).

4.2. Rolling Shutter Effect. Currently, most consumer cameras contain complementary metal-oxide-semiconductor (CMOS) sensors, which include rolling shutter mechanisms. The sequential read-out technique is a key feature of the rolling shutter camera, in which each frame is not captured at exactly the same instant [43]. The scene is captured by scanning rapidly, either horizontally or vertically, unlike a global shutter camera in which the complete scene is captured at the same time. In a rolling shutter camera, each row of pixels is exposed at once for the exposure time, and the read-out time prevents the rows of pixels from overlapping. In a single captured image, the rolling shutter thus allows multiple exposures. Therefore, for an LED that flickers on-off according to the modulated binary bit stream, the captured image contains a set of dark and white strips. The width of the strips depends on the modulation frequencies, and the number of strips depends on the distance.

Although different cameras may receive signals of the same frequencies from the same transmitter, the widths of the strips differ because the specifications of the camera sensors vary from device to device. In our system, a multiple-frequency-modulated signal is transmitted, and the dark and white strips received at different distances are depicted in Figure 3. The camera captured the pictures with a fixed clockwise ImageView rotation of 270 degrees. As the distance increases, the size of the LED on the IS decreases; the number of strips also decreases because the strip width remains identical for fixed frequencies at any distance for a particular camera.

4.3. Introduction to MR Functionalities. Localization and navigation are challenging issues in optical camera communication for the existing MRs. The initial requirement is to connect to the input interface of the MR to provide the necessary data from the Android phone, such as the location ID and its distribution system. When a camera captures the rolling images from multiple LEDs, it processes the images and obtains the locations, thereby making a decision about the location distribution system, and then feeds the data to the MR. The MR must exhibit a user input interface (UI) to set a target location ID, a database (DB) to store the ID distribution information, and a display screen to show the location information and the subsequent direction to move after making a decision. Figure 4 depicts the specifications of the MR and its required functionalities. It also exhibits a data- and image-processing unit to compare the IDs and a database to store the previous tracking information. The MR receives the IDs and can therefore localize itself; it can also compare the stored IDs and navigate by itself once the user sets a destination.

5. Operational Principle

5.1. Navigation Algorithm. The objective of the MRN system is to navigate an MR by receiving the locations or location


Figure 3: Experimentally, the number of dark and white strips decreases with increasing distance: (a) 52 strips at a distance of 100 cm; (b) 28 strips at a distance of 250 cm; (c) 5 strips at a distance of 480 cm.

Figure 4: Block diagram of the mobile robot functionalities. The smart phone unit (Android phone camera, image processing, ID distribution system, coordinate ID) feeds the MR unit (MRPU, DB, UI, and a display screen showing the location info and the direction to navigate).

ID using multiple LED lights. Figure 5 describes a flowchart of the navigation algorithm applied in our proposed system. A user inputs a target location into the smart phone that is mounted on top of an indoor MR. Because the complete environment of the room is unknown to the MR, it should first fix its position, which means a location should be observed inside the room. Therefore, it estimates the number of LEDs that are within the camera's FOV. To perform navigation, the MR should perceive the direction in which the location values are changing; thus, a minimum of three LEDs should be within the camera's FOV, as depicted in Figure 1(a) in the previous section. Here, in case I, the IS can detect only two LEDs, each of which exhibits the same value for the x location; it is therefore difficult to observe a variation in the value of the x location. In cases II and III, there are four LEDs within the camera's FOV because the MR is situated deep inside the room. If three or more LEDs are detected on the IS, the smart phone receives the LED locations and compares them with the target location. If the target location matches any of the received locations, the MR moves the horizontal distance toward the floor point directly below the target LED.

As mentioned in the previous paragraph, to localize the MR to the floor position of a target LED, we need to measure the horizontal distance from the MR to that floor position. Here, the floor position of an LED means the position on the floor directly below the LED. Suppose the target LED is transmitting the (x, y) location ID. When the MR is located at its initial position, as depicted in Figure 6, the implemented Android smartphone app calculates the direct distance D from the ceiling LED. The direction toward the LED is predicted from the image formation on the IS (the image formation technique on the IS is well described in [42]). After


Figure 5: The mobile robot navigation flowchart. The MR detects the N LEDs within the camera FOV; if N >= 3, it obtains the LED coordinates, compares them with the target coordinate, and either passes the measured horizontal distance to the matching LED or finds the nearest LED coordinate on the way to the target, passes that distance, and repeats; navigation terminates when the target coordinate is found.

obtaining the direct distance, we can easily find the horizontal distance s using the Pythagorean formula for a right triangle, as the vertical distance is fixed inside a room. After calculating the value of s and determining the direction of movement, the MR navigates this distance and fixes its location at the (x, y) location. If the target location does not match any of the received location IDs, the MR estimates the nearest LED on the way to the destination. Subsequently, the MR traverses the required distance to a point exactly below that particular LED. The MR then again receives the locations of the LEDs within the camera's FOV, and the procedure continues until the destination is reached. If the camera detects only two LEDs within its FOV, the MR cannot determine the location distribution system; therefore, it initially chooses one ID, moves to the point located exactly below that LED, and then continues with the aforementioned procedure.
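The horizontal-distance step of this procedure can be sketched as follows (a Python sketch; the `led_readings` layout and the nearest-LED cost model are our illustrative assumptions, not the paper's implementation):

```python
import math

def horizontal_distance(direct_d: float, h: float) -> float:
    """s = sqrt(D^2 - h^2): Pythagoras with the fixed vertical distance h."""
    return math.sqrt(max(direct_d ** 2 - h ** 2, 0.0))

def next_move(target, led_readings, h):
    """Pick the LED to drive toward: the target if visible, else the nearest one.

    led_readings: dict mapping an (x, y) location ID to its measured direct
    distance D. Returns (chosen_id, horizontal_distance_to_travel, reached).
    """
    if target in led_readings:
        return target, horizontal_distance(led_readings[target], h), True
    # Nearest LED coordinate to the target (Euclidean distance on grid IDs).
    nearest = min(led_readings,
                  key=lambda c: math.hypot(c[0] - target[0], c[1] - target[1]))
    return nearest, horizontal_distance(led_readings[nearest], h), False

# Example with values from Table 1: D = 274.5 cm, h = 255.5 cm gives s ~ 100 cm.
print(round(horizontal_distance(274.5, 255.5), 1))  # -> 100.3
```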

5.2. Camera Field of View. We considered the indoor environment to possess a uniform square distribution of LEDs. Figure 7 depicts the manner in which the number of LEDs within the camera's FOV depends on different factors, such as the height of the room, the distance between two adjacent LEDs, and the angle of view (AOV), i.e., the FOV angle. For the purpose of navigation, the MR should understand the location distribution system among the LEDs. It is obvious that at least two diagonally located LED locations should be considered to decide the directions in which the X and Y locations are increasing or decreasing. A relation has been derived to estimate the number of LEDs captured within the FOV at any time, considering the camera view angle, the height of the roof, and the surface area allotted to a single LED.

The horizontal and vertical FOV can be estimated using the following two equations:

\phi_h = 2 \tan^{-1} (d_h / 2f)  (1)

\phi_v = 2 \tan^{-1} (d_v / 2f)  (2)

where \phi_v and \phi_h are the FOV in the vertical and horizontal directions, d_h and d_v are the respective sensor dimensions, and f represents the focal length of the camera lens. Figure 7 shows the camera FOV, whose area is projected vertically onto the roof of height h_r. We can represent the area of the FOV using the following equation:

A_{FOV} = 4 h_r^2 \tan(\phi_v) \tan(\phi_h)  (3)

Because the LEDs are distributed in a square pattern with each LED separated by a distance a along each side (where a > LED diameter), each LED will occupy an area of a^2.


Figure 6: Calculation procedure of the distance to move horizontally. The camera-mounted MR at its initial position measures the direct distance D to the LED transmitting (x, y); with the fixed height h, it obtains the horizontal distance s to the final position (x, y) on the floor.

Figure 7: Analysis of the camera's FOV. The FOV of a camera at a height h_r below the roof projects a rectangle of sides 2 h_r tan(\phi_v) and 2 h_r tan(\phi_h) onto the ceiling.

Now, using (3), we can find the number of LEDs that will be in the camera FOV:

N_{LED} = 4 h_r^2 \tan(\phi_v) \tan(\phi_h) / a^2  (4)
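Equations (1)-(4) transcribe directly into code (a Python sketch using (3)-(4) as printed; the sensor dimensions, height, and LED pitch below are illustrative numbers, not measurements from the paper):

```python
import math

def fov_angle(sensor_dim: float, focal_len: float) -> float:
    """phi = 2 * atan(d / 2f), eqs. (1)-(2); both lengths in the same unit."""
    return 2.0 * math.atan(sensor_dim / (2.0 * focal_len))

def leds_in_fov(h_r: float, a: float, d_h: float, d_v: float, f: float) -> float:
    """N_LED = 4 h_r^2 tan(phi_v) tan(phi_h) / a^2, eqs. (3)-(4) as printed."""
    phi_h = fov_angle(d_h, f)
    phi_v = fov_angle(d_v, f)
    return 4.0 * h_r ** 2 * math.tan(phi_v) * math.tan(phi_h) / a ** 2

# Illustrative: 5.6 x 4.2 mm sensor, 4.2 mm focal length, camera 2.0 m below
# the ceiling, LED grid pitch a = 0.5 m.
print(round(leds_in_fov(h_r=2.0, a=0.5, d_h=5.6, d_v=4.2, f=4.2), 1))
```

Note that N_LED grows with the square of the ceiling height h_r, which is why the paper's later setup (2.56-m ceiling, 50-cm pitch) always keeps at least four LEDs in view.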

5.3. Exploring the ID Distribution System. The smart phone used in this study was a Samsung S7 edge, which contains a 26-mm-equivalent camera with a focal length of 4.2 mm. The height of the room observed by the camera was 2.56 m. We set up the Android camera with a height and width of 600 and 800 px, respectively. Under this type of setup, in the general scenario there were always at least four LEDs within the camera's FOV. To define the coordinate distribution system, we divided

Figure 8: Region-basis location selection and exploration of the distribution system. The IS is divided into the LF, FR, RB, and BL regions, each containing one LED whose location ID is received.

the IS into four regions, i.e., the left-front region (LFR), front-right region (FRR), right-back region (RBR), and back-left region (BLR), as depicted in Figure 8, where each region contains an equal number of pixels. The camera detects a single LED per region and receives its location ID as well. We defined the locations on a regional basis as (XLF, YLF), (XFR, YFR), (XRB, YRB), and (XBL, YBL), respectively, within the four regions. It is obvious that the LEDs will be present either in both the LF and RB regions or in both the FR and BL regions. If XLF < XRB or XFR > XBL, the x-location increases in the right direction; otherwise, it increases in the left direction. Further, if YLF > YRB or YFR > YBL, the y-location increases in the front direction; otherwise, it increases in the back direction.
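The region comparison above reads almost directly as code (a sketch; the dictionary layout and the 'right'/'front' labels are our own assumptions):

```python
def axis_directions(regions):
    """Infer the directions in which x and y grow from per-region LED IDs.

    regions: dict with keys 'LF', 'FR', 'RB', 'BL' mapping to an (x, y)
    location ID or None; LEDs appear on one of the two diagonals.
    Returns (x_direction, y_direction) as 'right'/'left' and 'front'/'back'.
    """
    if regions.get("LF") and regions.get("RB"):
        (x1, y1), (x2, y2) = regions["LF"], regions["RB"]
        x_dir = "right" if x1 < x2 else "left"   # XLF < XRB -> x grows right
        y_dir = "front" if y1 > y2 else "back"   # YLF > YRB -> y grows front
    elif regions.get("FR") and regions.get("BL"):
        (x1, y1), (x2, y2) = regions["FR"], regions["BL"]
        x_dir = "right" if x1 > x2 else "left"   # XFR > XBL -> x grows right
        y_dir = "front" if y1 > y2 else "back"   # YFR > YBL -> y grows front
    else:
        raise ValueError("need a diagonal pair of LEDs (LF+RB or FR+BL)")
    return x_dir, y_dir

print(axis_directions({"LF": (1, 2), "RB": (2, 1)}))  # -> ('right', 'front')
```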

6. Performance Analysis

6.1. Measurement of Distance Accuracy. The direct distance from the camera to an LED can be measured by considering a few distance parameters. If D is the direct distance of the LED (x_i, y_j) from the camera lens and d is the distance of the formed image on the sensor from the camera lens, then from the general lens equation we can write

1/D + 1/d = 1/F  (5)

d/D = F/(D - F)  (6)

where F is the focal length of the lens. Here, a magnification factor is required to compare the relative sizes of the actual LED and its detected image on the IS. The magnification factor m can be written as follows:

m = \sqrt{a_{ij}/A_{ij}} = d/D  (7)

where A_{ij} is the actual surface area of the nearest LED (x_i, y_j) and a_{ij} is the area covered by the detected image on the IS.


Table 1: Distance measurement data from an LED at different floor positions (the 1st radius r remained 11 px).

Horizontal distance d_h (cm) | Measured direct distance (cm) | Actual direct distance (cm) | Error in distance measurement (cm) | Detected area (px) | 2nd radius r' (px)
0   | 255.5  | 255.5  | 0    | 348 | 11
10  | 255.88 | 255.9  | 0.02 | 347 | 10
20  | 256.65 | 256.7  | 0.05 | 341 | 10
30  | 257.6  | 257.7  | 0.1  | 337 | 10
40  | 258.85 | 259    | 0.15 | 336 | 10
50  | 260    | 260.2  | 0.2  | 330 | 9
60  | 262.5  | 262.8  | 0.3  | 324 | 9
70  | 264.3  | 264.7  | 0.4  | 314 | 9
80  | 267.1  | 267.7  | 0.6  | 306 | 9
90  | 269.5  | 270.61 | 1.11 | 298 | 8
100 | 272.5  | 274.5  | 2    | 288 | 8

Normally, it is observed that F << D; therefore, we can combine the above equations as follows:

a_{ij} = m^2 A_{ij}  (8)

a_{ij} = (F^2 / D^2) A_{ij}  (9)

D = F \sqrt{A_{ij}/a_{ij}}  (10)

where the focal length F and the actual LED size A_{ij} are known. The only requirement is to obtain the detected LED area a_{ij} from the IS.
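Equation (10), together with the elliptical-area correction \pi r r' used for off-axis positions, can be sketched as follows (Python; the unit conversion from detected pixels to sensor area is left to the caller as an assumption):

```python
import math

def direct_distance(focal_len: float, led_area: float, detected_area: float) -> float:
    """D = F * sqrt(A_ij / a_ij), eq. (10); A_ij and a_ij in the same area unit."""
    return focal_len * math.sqrt(led_area / detected_area)

def ellipse_area(r: float, r_prime: float) -> float:
    """Corrected LED image area pi * r * r' when the circle is seen as an ellipse."""
    return math.pi * r * r_prime

# Scaling check: shrinking the detected image area by 4x doubles the distance.
d1 = direct_distance(4.2, 7100.0, 0.02)   # F = 4.2 mm, A = 71 cm^2 = 7100 mm^2
d2 = direct_distance(4.2, 7100.0, 0.005)
print(round(d2 / d1, 6))  # -> 2.0
```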

Using (10), the distance can be measured accurately at a position at which the camera is vertically in the LOS of the LED. However, when the MR navigates through places that are not directly below the LED, some errors can occur in this measurement, because the circular LED then appears to the camera as an ellipse. This happens due to the radial distortion of the camera [44]. Wide-angle, short-focal-length, and fisheye cameras are the main sources of radial distortion. Determining the intrinsic, extrinsic, and distortion coefficients of the IS and camera lens is required to remove the effect of radial distortion, which causes a significant transformation of the projected object from the actual object even at a normal focal length. To verify the localization scheme, a back-projection method is deployed to measure the area of the LED from the distorted image of real-world space [45]. In the following section, we exhibit a figure illustrating the accuracy of the distance measurement. Although the accuracy remains greater than 98% even at the boundary position, we have undertaken one further step to mitigate this error. Let r be the radius of the circular LED; its area is then \pi r^2. When the circular shape changes to an ellipse, one radius r reduces to r', and the area becomes \pi r r'. Table 1 presents the data measured from an LED with a size of 71 cm^2 located on a ceiling of height 2.56 m. The pixels allocated to the detected LED at different positions are

Figure 9: Percentage of accuracy of the measured direct distances from the camera to the LED at different horizontal points.

recorded to show that at far distances the detected contour becomes smaller than at nearer ones. We tabulated the radius of the ellipse, which is required to calculate its actual area and correct the positioning error.

This radius data is integrated within the application and is called when the radius is measured by the smartphone through image processing. Comparing the existing data with the actual area of the LED makes it possible to predict the direct distance.

The distance between the LED and the camera changes when the MR moves horizontally. The direct distance from a particular LED is required to calculate the horizontal distance that must be traversed to reach the floor location of that particular LED. We used (10) to measure the direct distance between the camera and the LED. Figure 9 graphically represents the percentage accuracy observed while measuring the direct distance at particular points on the floor. The distance is measured with 100% accuracy when the camera is situated directly below the LED. When the


Figure 10: Distance error comparison in terms of the number of LEDs inside the camera FOV at different horizontal points: (a) inter-LED distance = 50 cm and (b) inter-LED distance = 100 cm.

camera is shifted horizontally far away from the floor location of the LED, the accuracy is reduced.

To evaluate the performance of the proposed localization algorithm, we tested the accuracy of the distance measurement for different scenarios. We obtained a maximum distance measurement error of 2 cm in the case of a single LED inside the camera FOV. However, when there are multiple LEDs inside the camera FOV, the distance error increases, as the probability of an LED being at the middle of the IS is reduced. Figure 10 compares the error in measuring the distance between the LEDs and the camera while varying the number of LEDs inside the camera FOV. We considered two scenarios: in the first the inter-LED distance is 50 cm, and in the second it is 100 cm. In each scenario, we considered two to four LEDs for the comparison. We started measuring from the floor position of an LED and took ten more readings, moving away about 10 cm in each step. The figure shows that when there are more LEDs on the IS at the same time, the distance measurement error increases, because the chance of an LED being detected at the center of the IS is reduced accordingly. As our system comprises LEDs distributed in a rectangular pattern, four LEDs are most likely to be on the IS at a time; for this case, we have to accept up to 7.5 cm of distance error for positioning and navigation when the inter-LED distance is 50 cm. For the second case, described in Figure 10(b), the distance error increases up to 8.5 cm for four LEDs due to the increase of the inter-LED spacing to 100 cm.

Although the distance error decreases with fewer LEDs allowed into the camera FOV, more than two LEDs are required in the FOV to obtain the navigation information, so this error must be accepted. We have also described a way in this section to reduce the distance error to a certain level when moving the horizontal distance to reach the target LED location, thereby increasing the positioning performance.

Figure 11: BER comparison for M-FSK (M = 2, 4, 8) in an indoor wireless Rician fading channel.

6.2. Bit Error Rate for MFSK. To evaluate the bit error rate (BER) performance, we considered an indoor wireless Rician fading channel. MFSK is simple and flexible for multilink communication over this channel. It is known [46] that the bit error probability for MFSK is

\rho_s = (1/M) \sum_{i=2}^{M} (-1)^i \binom{M}{i} \exp[-(1 - 1/i) \rho]  (11)

where M is the modulation order and \rho is the instantaneous signal-to-noise ratio (SNR). In Figure 11, we depict a BER comparison between different modulation


Figure 12: Experimental setup: (a) four LEDs (each of area 71 cm^2), 50 cm apart from each other, transmit the location IDs; (b) captured image after setting the exposure time to 1/8000 sec.

orders (M = 2, 4, 8). We observe that the bit error probability decreases with increasing M when the transmission keeps the same bit energy, which means that the symbol duration becomes double for 4-FSK compared to 2-FSK while the same bit rate is maintained. In such a case, the BER becomes lower because the symbol duration increases with a higher modulation order.
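Equation (11) is straightforward to evaluate numerically (a Python sketch; for M = 2 the sum collapses to the familiar 0.5 exp(-rho/2) of noncoherent binary FSK):

```python
import math

def mfsk_error_prob(M: int, rho: float) -> float:
    """Eq. (11): (1/M) * sum_{i=2}^{M} (-1)^i * C(M, i) * exp(-(1 - 1/i) * rho)."""
    return sum((-1) ** i * math.comb(M, i) * math.exp(-(1.0 - 1.0 / i) * rho)
               for i in range(2, M + 1)) / M

# For M = 2 the single surviving term gives 0.5 * exp(-rho / 2).
print(mfsk_error_prob(2, 10.0) == 0.5 * math.exp(-5.0))
```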

7. Demonstration Results

We used a platform to demonstrate our OCC system and evaluate its performance. To understand the ID distribution system, we should receive the location information from a minimum of two diagonally located LEDs, as described in Section 5. The platform comprises four LED transmitter links placed in a square shape, as depicted in Figure 12(a). The LEDs are arranged with a distance of 50 cm between them. For better explanation, we considered these transmitters to be a single unit. These four transmitters are linked continuously and transmit four different locations, which are encoded with the help of a PC and an LED driver circuit.

To focus only on the target transmitting LEDs, we controlled the camera exposure time, i.e., the duration for which each pixel is exposed to light. Using this process, we can obtain the dark and white strips in the regions where the LEDs are captured by darkening all parts of the image except those that depict the target LEDs. Figure 12(b) depicts the image of the transmitting LED unit captured by the Samsung S7 (edge) camera after setting the camera's CaptureRequest.SENSOR_EXPOSURE_TIME to 1/8000 sec. We can observe that all four LEDs present the same dark and bright strip widths, because the same modulation frequencies, ranging from 2 to 4 kHz, were used for all four transmitters. Generally, the characteristics of the rolling image are such that the strips are formed horizontally; however, in our case we rotated the image using ImageView by 270 degrees, which is why the strips appear vertical.

To demodulate the data from the detected strip pattern, we should initially measure the width of each dark and white strip. Different cameras have different frame rates, and even for a fixed camera the frame rate fluctuates within a short range. For this reason the width also varies, and as the width is measured as a number of pixels in a row, it should be an integer value. In our system, we observed brighter strips whose width varied between 5 and 7 pixels, whereas the width of the darker strips varied between 1 and 3 pixels. Therefore, the midvalue of four was chosen to distinguish between dark and white patterns. When the measured width is greater than four, the smart phone considers the LED to be in the ON state and stores a binary '1' value; otherwise, the LED is assumed to be in the OFF state and a '0' value is stored. Figure 13 illustrates the Android implementation software that can simultaneously receive data from four LEDs. We received five bits of ID per LED and mapped each combination of the received binary ID to a fixed location in the application system. The table in Figure 13 depicts the manner in which the locations are assigned to different IDs. To receive multiple LEDs, the overall time for processing one frame should be within the critical frame time; otherwise, the application may stop unexpectedly. The critical frame time is defined as the maximum processing time for one frame. In our system, we used 20 fps to limit the critical frame time to 50 msec. We add a starting symbol so that the receiver can recognize when the data starts; the time to obtain the data therefore depends on the timing of the frame containing that symbol. We tested in real time by debugging the software in Android Studio and observed the time required for initialization, frame processing, demodulation, and device response. We found that almost
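The width-threshold rule and the start-symbol framing above can be sketched as follows (Python; the width list is illustrative, not captured data):

```python
def strips_to_chips(widths, threshold=4):
    """Paper's rule: strip width > 4 px -> bright -> '1'; otherwise dark -> '0'."""
    return "".join("1" if w > threshold else "0" for w in widths)

def decode_codeword(chips: str) -> str:
    """Strip the '11' start symbol, then invert the mapping '1'->'01', '0'->'00'."""
    if not chips.startswith("11"):
        raise ValueError("start symbol '11' not found")
    payload = chips[2:]
    pairs = [payload[i:i + 2] for i in range(0, len(payload), 2)]
    return "".join("1" if p == "01" else "0" for p in pairs)

# Illustrative widths for the code-word '11' + '0101000101' (data bits '11011'):
widths = [6, 5, 2, 7, 3, 6, 1, 2, 3, 6, 2, 5]
print(decode_codeword(strips_to_chips(widths)))  # -> 11011
```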


Figure 13: Implemented Android application receiving multiple LED IDs simultaneously. The displayed table maps the received bits to coordinates: 00000 -> (1,1), 00001 -> (1,2), 00010 -> (2,1), 00011 -> (2,2), 00100 -> (1,3).

1000 ms is required for debugging and obtaining the device response when communicating with 4 LEDs simultaneously. As each LED transmits 10 bits, we obtained data at a 40-bps rate for the 4 LEDs with this device.

8. Conclusion

This work focuses on the reception of data from multiple indoor LED transmitters as well as on the localization and navigation of an MR processing the received data. The indoor LED locations are received as binary data of five bits per link at a data rate of 40 bps. The proposed algorithm was implemented on an indoor platform where four LEDs, each with a size of 71 cm^2, were kept at a distance of 50 cm from each other. The four LEDs transmit four different IDs, and the Android application developed to receive these IDs performed well with all of them simultaneously. We were able to communicate with the LEDs from a maximum distance of 480 cm. As our positioning approach requires measurement of the direct distance from the camera to the LED, we compared the accuracy of the distance measurement at different floor positions. Keeping the floor height constant, we experienced an increasing percentage of error as the camera moved a long distance horizontally. To mitigate this error, we fixed some recorded distance values within the application for comparison. We found a maximum error of 2 cm at a 100-cm horizontal distance from the floor position of the target LED when the MR moved horizontally toward that LED. Thus, the accuracy is improved from the localization and navigation viewpoint as well as in terms of the indoor communication distance.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the Global IT Talent support program (IITP-2017-0-01806) supervised by the IITP (Institute for Information and Communication Technology Promotion), and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (Performance Enhancement and Resource Management of Visual MIMO-based Camera Communication Networks) (No. NRF-2016R1D1A1A09919091).

References

[1] M. Z. Chowdhury, M. T. Hossan, A. Islam, and Y. M. Jang, "A Comparative Survey of Optical Wireless Technologies: Architectures and Applications," IEEE Access, vol. 6, pp. 9819–9840, 2018.
[2] M. T. Hossan, M. Z. Chowdhury, A. Islam, and Y. M. Jang, "A Novel Indoor Mobile Localization System Based on Optical Camera Communication," Wireless Communications and Mobile Computing, vol. 2018, 17 pages, 2018.
[3] A. Khalajmehrabadi, N. Gatsis, and D. Akopian, "Modern WLAN Fingerprinting Indoor Positioning Methods and Deployment Challenges," IEEE Communications Surveys & Tutorials, vol. 19, no. 3, pp. 1974–2002, 2017.
[4] E. Kupershtein, M. Wax, and I. Cohen, "Single-site emitter localization via multipath fingerprinting," IEEE Transactions on Signal Processing, vol. 61, no. 1, pp. 10–21, 2013.
[5] P. Luo et al., "Experimental demonstration of RGB LED-based optical camera communications," IEEE Photonics Journal, vol. 7, no. 5, pp. 1–12, 2015.
[6] R. D. Roberts, "Undersampled frequency shift ON-OFF keying (UFSOOK) for camera communications (CamCom)," in Proceedings of the 22nd Wireless and Optical Communications Conference (WOCC 2013), pp. 645–648, China, May 2013.
[7] P. Luo, Z. Ghassemlooy, H. L. Minh, H. Tsai, and X. Tang, "Undersampled-PAM with subcarrier modulation for camera communications," in Proceedings of the 20th Opto-Electronics and Communications Conference (OECC), pp. 1–3, Shanghai, China, 2015.
[8] Y.-Y. Won, S.-H. Yang, D.-H. Kim, and S.-K. Han, "Three-dimensional optical wireless indoor positioning system using location code map based on power distribution of visible light emitting diode," IET Optoelectronics, vol. 7, no. 3, pp. 77–83, 2013.
[9] C. Yang and H.-R. Shao, "WiFi-based indoor positioning," IEEE Communications Magazine, vol. 53, no. 3, pp. 150–157, 2015.
[10] R. Xu, W. Chen, Y. Xu, and S. Ji, "A New Indoor Positioning System Architecture Using GPS Signals," Sensors, vol. 15, no. 5, pp. 10074–10087, 2015.
[11] Z. Zhou, M. Kavehrad, and P. Deng, "Indoor positioning algorithm using light-emitting diode visible light communications," Optical Engineering, vol. 51, no. 8, 2012.
[12] W. Zhang, M. I. S. Chowdhury, and M. Kavehrad, "Asynchronous indoor positioning system based on visible light communications," Optical Engineering, vol. 53, no. 4, Article ID 045105, 2014.
[13] S.-Y. Jung, S. Hann, S. Park, and C.-S. Park, "Optical wireless indoor positioning system using light emitting diode ceiling lights," Microwave and Optical Technology Letters, vol. 54, no. 7, pp. 1622–1626, 2012.
[14] H.-S. Kim, D.-R. Kim, S.-H. Yang, Y.-H. Son, and S.-K. Han, "An indoor visible light communication positioning system using an RF carrier allocation technique," Journal of Lightwave Technology, vol. 31, no. 1, pp. 134–144, 2013.
[15] K. Panta and J. Armstrong, "Indoor localisation using white LEDs," IEEE Electronics Letters, vol. 48, no. 4, pp. 228–230, 2012.
[16] A. D. Cheok and L. Yue, "A novel light-sensor-based information transmission system for indoor positioning and navigation," IEEE Transactions on Instrumentation and Measurement, vol. 60, no. 1, pp. 290–299, 2011.
[17] M. Nakajima and S. Haruyama, "New indoor navigation system for visually impaired people using visible light communication," EURASIP Journal on Wireless Communications and Networking, vol. 2013, article 37, 2013.
[18] H. Huang, A. Yang, L. Feng, G. Ni, and P. Guo, "Indoor Positioning Method Based on Metameric White Light Sources and Subpixels on a Color Image Sensor," IEEE Photonics Journal, vol. 8, no. 6, 2016.
[19] M. S. Hossen, Y. Park, and K. Kim, "Performance improvement of indoor positioning using light-emitting diodes and an image sensor for light-emitting diode communication," Optical Engineering, vol. 54, no. 3, pp. 1–11, 2015.
[20] M.-G. Moon, S.-I. Choi, J. Park, and J. Y. Kim, "Indoor positioning system using LED lights and a dual image sensor," Journal of the Optical Society of Korea, vol. 19, no. 6, pp. 586–591, 2015.
[21] Y.-S. Kuo, P. Pannuto, K.-J. Hsiao, and P. Dutta, "Luxapose: Indoor positioning with mobile phones and visible light," in Proceedings of the 20th ACM Annual International Conference on Mobile Computing and Networking (MobiCom 2014), pp. 447–458, USA, September 2014.
[22] P. Huynh and M. Yoo, "VLC-based positioning system for an indoor environment using an image sensor and an accelerometer sensor," Sensors, vol. 16, no. 6, 2016.
[23] H. Steendam, T. Q. Wang, and J. Armstrong, "Theoretical lower bound for indoor visible light positioning using received signal strength measurements and an aperture-based receiver," Journal of Lightwave Technology, vol. 35, no. 2, pp. 309–319, 2017.
[24] F. Yin, Y. Zhao, F. Gunnarsson, and F. Gustafsson, "Received-signal-strength threshold optimization using Gaussian processes," IEEE Transactions on Signal Processing, vol. 65, no. 8, pp. 2164–2177, 2017.
[25] X. Piao, Y. Zhang, T. Li et al., "RSS fingerprint based indoor localization using sparse representation with spatio-temporal constraint," Sensors, vol. 16, no. 11, 2016.
[26] C. Feng, W. S. A. Au, S. Valaee, and Z. Tan, "Received-signal-strength-based indoor positioning using compressive sensing," IEEE Transactions on Mobile Computing, vol. 11, no. 12, pp. 1983–1993, 2012.
[27] W. Gu, W. Zhang, M. Kavehrad, and L. Feng, "Three-dimensional light positioning algorithm with filtering techniques for indoor environments," Optical Engineering, vol. 53, no. 10, 2014.
[28] S.-H. Yang, E.-M. Jung, and S.-K. Han, "Indoor location estimation based on LED visible light communication using multiple optical receivers," IEEE Communications Letters, vol. 17, no. 9, pp. 1834–1837, 2013.
[29] F. I. K. Mousa, H. Le-Minh, Z. Ghassemlooy et al., "Indoor localization system utilizing two visible light emitting diodes," Optical Engineering, vol. 55, no. 11, 2016.
[30] K. Wang, A. Nirmalathas, C. Lim, K. Alameh, and E. Skafidas, "Optical wireless-based indoor localization system employing a single-channel imaging receiver," Journal of Lightwave Technology, vol. 34, no. 4, pp. 1141–1149, 2016.
[31] S.-H. Jung, B.-C. Moon, and D. Han, "Unsupervised learning for crowdsourced indoor localization in wireless networks," IEEE Transactions on Mobile Computing, vol. 15, no. 11, pp. 2892–2906, 2016.
[32] X. Wang, L. Gao, S. Mao, and S. Pandey, "CSI-Based Fingerprinting for Indoor Localization: A Deep Learning Approach," IEEE Transactions on Vehicular Technology, vol. 66, no. 1, pp. 763–776, 2017.
[33] Y. Hu and G. Leus, "Robust differential received signal strength-based localization," IEEE Transactions on Signal Processing, vol. 65, no. 12, pp. 3261–3276, 2017.
[34] T.-H. Do and M. Yoo, "TDOA-based indoor positioning using visible light," Photonic Network Communications, vol. 27, no. 2, pp. 80–88, 2014.
[35] S.-Y. Jung, S. Hann, and C.-S. Park, "TDOA-based optical wireless indoor localization using LED ceiling lamps," IEEE Transactions on Consumer Electronics, vol. 57, no. 4, pp. 1592–1597, 2011.
[36] U. Nadeem, N. U. Hassan, M. A. Pasha, and C. Yuen, "Highly accurate 3D wireless indoor positioning system using white LED lights," IEEE Electronics Letters, vol. 50, no. 11, pp. 828–830, 2014.
[37] T. Q. Wang, Y. A. Sekercioglu, A. Neild, and J. Armstrong, "Position accuracy of time-of-arrival based ranging using visible light with application in indoor localization systems," Journal of Lightwave Technology, vol. 31, no. 20, pp. 3302–3308, 2013.
[38] O. Ennasr, G. Xing, and X. Tan, "Distributed time-difference-of-arrival (TDOA)-based localization of a moving target," in Proceedings of the 55th IEEE Conference on Decision and Control (CDC 2016), pp. 2652–2658, USA, December 2016.
[39] S. H. Yang, E. M. Jeong, and S. K. Han, "Indoor positioning based on received optical power difference by angle of arrival," IEEE Electronics Letters, vol. 50, no. 1, pp. 49–51, 2014.
[40] A. Arafa, X. Jin, and R. Klukas, "Wireless indoor optical positioning with a differential photosensor," IEEE Photonics Technology Letters, vol. 24, no. 12, pp. 1027–1029, 2012.
[41] M. T. Hossan, M. Z. Chowdhury, M. K. Hasan et al., "A New Vehicle Localization Scheme Based on Combined Optical Camera Communication and Photogrammetry," Mobile Information Systems, vol. 2018, Article ID 8501898, 14 pages, 2018.
[42] S. M. Berman, D. S. Greenhouse, I. L. Bailey, R. D. Clear, and T. W. Raasch, "Human electroretinogram responses to video displays, fluorescent lighting, and other high frequency sources," Optometry and Vision Science, vol. 68, no. 8, pp. 645–662, 1991.
[43] H.-Y. Lee, H.-M. Lin, Y.-L. Wei, H.-I. Wu, H.-M. Tsai, and K. C.-J. Lin, "RollingLight: Enabling line-of-sight light-to-camera communications," in Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services (MobiSys 2015), pp. 167–180, Italy, May 2015.
[44] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[45] H. A. Martins, J. R. Birk, and R. B. Kelley, "Camera models based on data from two calibration planes," Computer Graphics and Image Processing, vol. 17, no. 2, pp. 173–180, 1981.
[46] E. Arthurs and H. Dym, "On the Optimum Detection of Digital Signals in the Presence of White Gaussian Noise—A Geometric Interpretation and a Study of Three Basic Data Transmission Systems," IRE Transactions on Communications Systems, vol. 10, no. 4, pp. 336–372, 1962.


Figure 1: A scenario of the proposed model: (a) MRP and MRN system using OCC, showing cases I–III at positions 1–3 with different numbers of LEDs (2 or 4) within the camera FOV, and (b) the transmitter set (DC 12 V supply, Arduino, high-speed switching, and LED).

performance. Meanwhile, a generic framework of RSS has been introduced to enrich the positioning performance [24]. Alternatively, the combination of spatiotemporal constraints in RSS fingerprint-based localization has achieved the same purpose [25]. Compressive sensing has been applied to improve the localization performance in noisy scenarios [26]. The implication of the RSS approach was also observed in the VL-based positioning schemes [27–29]; here, the localization resolution was observed to be 4 m [30]. Importantly, an artificial intelligence algorithm that was applied to collect the location-labeled fingerprints was reported for optimizing the overall performance [31, 32]. Although the positioning accuracy that was obtained for the RSS fingerprints was observed to be 80% [33], a 3D space fingerprint requires considerable overhead for positioning a high-speed object as compared with a simpler 2D space fingerprint.

Time-of-arrival (TOA) and time-difference-of-arrival (TDOA) are two methods that have been used several times to provide a solution for localization in indoor environments. A TDOA-based localization scheme has been proposed in which a light beam with a unique frequency was transmitted in different phases [34–36]. A TOA-based localization algorithm has also been developed [37]. Importantly, the TOA- and TDOA-based localization schemes are cost-effective and accurate. However, these schemes depend on the location information from a central node as well as from other reference nodes in the same indoor space. An extended Kalman filter-based TDOA approach that ignores the impact of this dependency on the reference node information has also been proposed [38]. However, deploying such a tracking algorithm is not always advantageous because the extended Kalman filter, being a first-order approximation, can fail to accurately estimate the position.

Angle-of-arrival (AOA) is another method that can be applied for indoor localization. In this approach, the receiver estimates the angle of the transmitted signal by calculating the time difference of arrival from the individual transmitters. Transmitting the gain difference from the LED has been considered by AOA for indoor localization [30, 39, 40]. Simulation results depicted that the average localization resolution was 35 cm. However, in our proposed system it is not essential to gather the angle information from the LEDs to measure the position of the camera. The position is calculated with the help of a distance comparison among several LEDs (e.g., more than two LEDs). These distances are calculated using the photogrammetry technique [41], where the image area occupied by the LEDs on the image sensor varies with the distance. Furthermore, AOA exhibits some disadvantages: the accuracy degrades with increasing distance between the transmitter and receiver; reflections of the signal along multiple paths add error to the location measurement and demand large, complex hardware; and the shadowing and directivity of the measuring aperture have a significant effect on the overall performance of location estimation.

3. Indoor Transmitter Scenario

Figure 1(a) depicts the proposed indoor transmitter and receiver scenario for the MR positioning and navigation system using OCC. A circular white LED with a diameter of 9.5 cm was used as the transmitter. Because the LED can switch its luminance rapidly, it can flicker at speeds that are beyond the perception of the human eye. The indoor location ID is modulated with a high modulation frequency and encoded with additional bits using an LED driver circuit. This encoded signal is fed to the LED, which continuously transmits the ID using flickering light. The driver circuit comprises a 12-V dc power supply, an Arduino board, and a high-speed switching device, namely a metal-oxide-semiconductor field-effect transistor (MOSFET). The transmitter set with the LED driver circuit is depicted in Figure 1(b). The LEDs are installed on the ceiling, and the location number is transmitted to the MR's receiver camera. In our system, the transmitters are arranged in a square-shaped regular pattern in which adjacent LEDs are 50 cm apart along each side. The camera is mounted on top of the MR, which maintains an equal height along the z-axis; therefore, the LEDs transmit modulated data bits that contain only the (x, y) location information.


Figure 2: Transmitted data bit mechanisms (bit '0' → '00' and bit '1' → '01'; data bits of 500 µs at 2 kHz become 250 µs chips at 4 kHz after bi-phase Manchester encoding; starting bits precede the data bits).

Visual flickering in optical wireless communication is a major problem; it can be defined as a perceptible change in the illumination of the LED, as seen by the human eye, while the LED transmits a binary '1'/'0' code-word. Generally, human eyes cannot perceive flickering if the data is modulated with a frequency greater than 200 Hz [42]. The LED transmitter should be driven using an appropriate LED driver circuit to avoid any noticeable flicker. Multiple frequency-shift keying (MFSK) is used as the modulation technique to generate the modulated binary code-words using the LED driver circuit. Figure 2 depicts the characteristics of the transmitted bit pattern. Our proposed study uses modulation frequencies of 2 and 4 kHz, which are high enough to avoid flickering. Further, the modulated signal is encoded using bi-phase Manchester line coding. Each of the binary bits 1 and 0 is defined using a different code: bit '1' is defined as '01', whereas bit '0' is defined as '00'. We transmitted five bits of data from each LED; therefore, in a single indoor system, 32 location IDs can be provided to 32 LEDs. Furthermore, if the actual data is '11011', the encoded signal will be '0101000101' after bi-phase Manchester line coding, as shown in Figure 2. Although there are three consecutive '0's in this example, the encoding limits long identical runs and thereby mitigates visual flickering. As this code stream is continuously transmitted, a starting symbol '11' is added at the beginning of the code-word to differentiate between successive code-words.
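The encoding described above can be sketched as follows, using the mapping given in the text (bit '1' → '01', bit '0' → '00', starting symbol '11'); this is an illustrative sketch, not the authors' transmitter firmware:

```python
# Bi-phase Manchester-style encoding as described in the text:
# bit '1' -> '01', bit '0' -> '00', with the starting symbol '11'
# prepended to delimit successive code-words.
def encode_id(data_bits):
    mapping = {'1': '01', '0': '00'}
    payload = ''.join(mapping[b] for b in data_bits)
    return '11' + payload

codeword = encode_id('11011')
print(codeword)  # '110101000101'
```

For the five-bit ID '11011' this reproduces the '0101000101' payload given in the text, preceded by the '11' start symbol.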

4. Receiver Characteristics

4.1. Android Application Background. An application was designed on the Android Studio platform using the camera2 hardware package, which was added in API level 21. Some Android features used to create the camera device and process the requests are described in this study. One request acts to capture a single frame and output a set of image buffers for the request. A CameraManager is required when multiple requests are in the queue, to maintain the full frame rate. To accumulate the captured frame data, a getCameraCharacteristics class is used. The image data is encapsulated in Image objects and can be directly accessed through a class named ImageReader using the YUV_420_888 format. The image data is rendered onto a surface with a defined size. For processing the captured image frames, OpenCV 3.2.0 libraries were imported into the application. Initially, the camera shutter speed is controlled to focus only on the white LEDs and to detect only the region of interest (RoI).

4.2. Rolling Shutter Effect. Currently, most consumer cameras contain complementary metal-oxide-semiconductor (CMOS) sensors, which include rolling shutter mechanisms. The sequential read-out technique is a key feature of the rolling shutter camera, in which each frame is not captured at exactly the same instant [43]. The scene is captured by scanning rapidly, either horizontally or vertically, unlike a global shutter camera in which the complete scene is captured at the same time. In a rolling shutter camera, each row of pixels is exposed at once during the exposure time, and the read-out time prevents the rows of pixels from overlapping. In a single captured image, the rolling shutter therefore allows multiple exposures. Thus, for an LED that flickers on-off according to the modulated binary bit stream, the captured image contains a series of dark and white strips. The width of the strips depends on the modulation frequencies, and the number of strips depends on the distance.

Although a different camera receives signals of similar frequencies from the same transmitter, the width of the strips differs because the specifications of the camera sensors may vary from device to device. In our system, a multiple-frequency-modulated signal is transmitted, and the dark and white strips are received at different distances, as depicted in Figure 3. The camera captured the pictures with a fixed clockwise ImageView rotation of 270°. As the distance increases, the size of the LED on the IS decreases; the number of strips also decreases, because the width remains identical for fixed frequencies at any distance for a particular camera.
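The relation between modulation frequency and strip width can be illustrated as follows; the row read-out rate is an assumed, device-specific value (not reported in the text), chosen here so that the result falls in the 5-7 px range observed for bright strips:

```python
# A strip in a rolling-shutter image spans the pixel rows read out during
# one ON (or OFF) interval of the modulated LED, so its expected width is
# (row read-out rate) * (interval duration).
def strip_width_rows(row_rate_hz, interval_s):
    return row_rate_hz * interval_s

# A 2 kHz square wave has 250 us half-periods; 24,000 rows/s is an
# assumed sensor read-out rate for illustration only.
print(round(strip_width_rows(24_000, 250e-6), 6))  # 6.0
```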

4.3. Introduction to MR Functionalities. Localization and navigation are challenging issues in optical camera communication for existing MRs. The initial requirement is to connect to the input interface of the MR to provide the necessary data from the Android phone, such as the location ID and its distribution system. When a camera captures the rolling images from multiple LEDs, it processes the images and obtains the locations, thereby making a decision about the location distribution system, and then feeds the data to the MR. The MR must exhibit a user input interface (UI) to set a target location ID, a database (DB) to store the ID distribution information, and a display screen to exhibit the location information and the subsequent direction to move after making a decision. Figure 4 depicts the specifications of the MR and its required functionalities. It also exhibits a data- and image-processing unit to compare the IDs and a database to store the previous tracking information. The MR receives the IDs and can therefore localize itself. It also has the capability to compare the stored IDs and navigate by itself if the user sets a destination.

5. Operational Principle

5.1. Navigation Algorithm. The objective of the MRN system is to navigate an MR by receiving the locations or location


Figure 3: Experimentally, the number of dark and white strips decreases with increasing distance: (a) 52 strips at a distance of 100 cm; (b) 28 strips at a distance of 250 cm; (c) 5 strips at a distance of 480 cm.

Figure 4: Block diagram of the mobile robot functionalities (smart phone unit: Android phone camera, image processing, and ID distribution system with coordinate IDs; MR unit: MRPU, UI, DB, and a display screen showing the location information and the direction to navigate).

ID using multiple LED lights. Figure 5 describes a flowchart of the navigation algorithm applied in our proposed system. A user inputs a target location into the smart phone that is

mounted on the top of an indoor MR. Because the complete environment of the room is unknown to the MR, it should first fix its position, which means that a location should be observed inside the room. Therefore, it estimates the number of LEDs that are within the camera's FOV. To perform navigation, the MR should perceive the direction in which the location values are changing; thus, a minimum of three LEDs should be within the camera's FOV, as depicted in Figure 1(a) in the previous section. Here, in case I, the IS can detect only two LEDs, among which each LED exhibits the same value for the x location, so it is difficult to observe a variation in the value of the x location. In cases II and III, there are four LEDs within the camera's FOV, as the MR is situated deep inside the room. If three or more LEDs are detected using the IS, the smart phone receives the LED locations and compares them with the target location. If the target location matches any of the received locations, the MR moves a horizontal distance toward the floor position directly below the target LED.
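One decision step of this navigation flow can be sketched as below; `next_waypoint` is a hypothetical helper (its name and the nearest-LED selection rule are illustrative assumptions), which returns the target if it is visible and otherwise the visible LED closest to it:

```python
import math

# One decision step of the navigation flow: given the LED coordinates
# decoded within the camera FOV and the target coordinate, either match
# the target directly or select the nearest visible LED to move under next.
def next_waypoint(visible, target):
    if target in visible:
        return target, True                      # target found in the FOV
    nearest = min(visible, key=lambda c: math.dist(c, target))
    return nearest, False                        # move below the nearest LED

wp, done = next_waypoint([(1, 1), (1, 2), (2, 1), (2, 2)], (4, 4))
print(wp, done)  # (2, 2) False
```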

As mentioned in the previous paragraph, to localize the MR to the floor position of a target LED, we need to measure the horizontal distance from the MR to that floor position. Here, the floor position of an LED means the position on the floor that is directly below the LED. Suppose the target LED is transmitting the (x, y) location ID. When the MR is located at its initial position, as depicted in Figure 6, the implemented Android smartphone app calculates the direct distance D from the ceiling LED. The direction toward the LED is predicted from the image formation on the IS (the image formation technique on the IS is well described in [42]). After


Figure 5: The mobile robot navigation flowchart.

obtaining the direct distance, we can easily find the horizontal distance s using the Pythagorean theorem for a right triangle, as the vertical distance is fixed inside a room. After calculating the value of s and determining the direction to move, the MR navigates this distance and fixes its location to the (x, y) location. If the target location does not match any of the received location IDs, the MR estimates the nearest LED on the way to the destination. Subsequently, the MR traverses the required distance to a point that is exactly below that particular LED. The MR then again receives the locations of the LEDs that are within the camera's FOV, and the procedure continues until the destination is reached. If the camera detects only two LEDs within its FOV, the MR cannot determine the location distribution system; therefore, it initially chooses one ID, goes to a point that is located exactly below that LED, and then continues with the aforementioned procedure.
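The horizontal-distance step can be written down directly; the example values below (direct distance 3.2 m, vertical distance 2.56 m) are illustrative, with only the 2.56 m ceiling height taken from the text:

```python
import math

# Horizontal distance s from the MR to the floor position of the target
# LED, obtained from the direct distance D and the fixed vertical
# distance h via the Pythagorean theorem: s = sqrt(D^2 - h^2).
def horizontal_distance(D, h):
    return math.sqrt(D * D - h * h)

# Illustrative values: D = 3.2 m, h = 2.56 m.
print(round(horizontal_distance(3.2, 2.56), 3))  # 1.92
```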

5.2. Camera Field of View. We considered the indoor environment to possess a uniform square distribution of LEDs. Figure 7 depicts the manner in which the number of LEDs within the camera's FOV depends on different factors, such as the height of the room, the distance between two adjacent LEDs, and the angle of view (AOV), i.e., the FOV angle. For the purpose of navigation, the MR should understand the location distribution system among the LEDs. It is obvious that at least two diagonally located LED locations should be considered to make a decision about the direction in which the X and Y locations are increasing or decreasing. A relation has been derived to estimate the number of LEDs captured within the FOV at any time, considering the camera view angle, the height of the roof, and the surface area taken by a single LED.

The horizontal and vertical FOV can be estimated using the following two equations:

φ_h = 2 tan⁻¹(d_h / 2f)   (1)

φ_v = 2 tan⁻¹(d_v / 2f)   (2)

where φ_v and φ_h are the FOV in the vertical and horizontal directions, d_h and d_v are the sensor dimensions, and f represents the focal length of the camera lens. Figure 7 shows the camera FOV, whose area is projected vertically onto the roof of height h_r. We can represent the area of the FOV using the following equation:

A_FOV = 4 h_r² tan(φ_v) tan(φ_h)   (3)

Because the LEDs are distributed in a square pattern with each LED separated by a distance a along each side (where a > LED diameter), each LED will occupy an area of a².


Figure 6: Calculation procedure of the distance to move horizontally (direct distance D, vertical height h, and horizontal distance s between the initial position of the camera-mounted MR and its final position below the LED transmitting (x, y)).

Figure 7: Analysis of the camera's FOV, whose footprint on the ceiling plane at height h_r has sides of length 2h_r tan(φ_h) and 2h_r tan(φ_v).

Now, using (3), we can find the number of LEDs that will be in the camera FOV:

N_LED = 4 h_r² tan(φ_v) tan(φ_h) / a²   (4)

5.3. Exploring the ID Distribution System. The smart phone used to perform this study was a Samsung S7 edge, which contains a 26-mm (35-mm equivalent) camera with a focal length of 4.2 mm. The height of the room observed using the camera was 2.56 m. We set up the Android camera with a height and width of 600 and 800 px, respectively. Under this type of consideration, in a general scenario there were always at least four LEDs within the camera's FOV. To define the coordinate distribution system, we divided

Figure 8: Region-basis location selection (LF, FR, RB, and BL regions) and exploring their distribution system.

the IS into four regions, i.e., the left-front region (LFR), front-right region (FRR), right-back region (RBR), and back-left region (BLR), as depicted in Figure 8, where each region contains an equal number of pixels. The camera will detect a single LED per region and receive its location ID as well. We defined the locations on a regional basis, which are (X_LF, Y_LF), (X_FR, Y_FR), (X_RB, Y_RB), and (X_BL, Y_BL), respectively, within the four regions. It is obvious that the LEDs will be present either in both the LF and RB regions or in both the FR and BL regions. If X_LF < X_RB or X_FR > X_BL, the x-location increases in the right direction; otherwise, it increases in the left direction. Further, if Y_LF > Y_RB or Y_FR > Y_BL, the y-location increases in the front direction; otherwise, it increases in the back direction.
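The regional comparison rules can be sketched as follows; this is a minimal illustration using only the LF/RB pair (a fuller version would also test FR against BL), with hypothetical coordinate values:

```python
# Infer the directions in which the x and y coordinates increase from the
# LED locations decoded in two diagonal IS regions (LF and RB), following
# the comparison rules in the text. Coordinates are (x, y) tuples.
def coordinate_directions(lf, rb):
    x_dir = 'right' if lf[0] < rb[0] else 'left'
    y_dir = 'front' if lf[1] > rb[1] else 'back'
    return x_dir, y_dir

# Hypothetical example: LF region decodes (1, 2), RB region decodes (2, 1).
print(coordinate_directions(lf=(1, 2), rb=(2, 1)))  # ('right', 'front')
```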

6. Performance Analysis

6.1. Measurement of Distance Accuracy. The direct distance from the camera to an LED can be measured by considering a few distance parameters. If D is the direct distance of the LED (x_i, y_j) from the camera lens and d is the distance of the formed image on the sensor from the camera lens, then from the general lens equation we can write

1/D + 1/d = 1/F   (5)

d/D = F / (D − F)   (6)

where F is the focal length of the lens. Here, a magnification factor is introduced to compare the relative sizes of the actual LED and its detected image on the IS. The magnification factor m can be written as follows:

m = √(a_ij / A_ij) = d/D   (7)

where A_ij is the actual surface area of the nearest LED (x_i, y_j) and a_ij is the area covered by the detected image on the IS.


Table 1: Distance measurement data from an LED at different floor positions (d_h: horizontal distance; r, r′: first and second radii of the detected contour).

d_h (cm) | Measured direct distance (cm) | Actual direct distance (cm) | Error (cm) | Detected area (px) | r (px) | r′ (px)
0   | 255.50 | 255.50 | 0.00 | 348 | 11 | 11
10  | 255.88 | 255.90 | 0.02 | 347 | –  | 10
20  | 256.65 | 256.70 | 0.05 | 341 | –  | 10
30  | 257.60 | 257.70 | 0.10 | 337 | –  | 10
40  | 258.85 | 259.00 | 0.15 | 336 | –  | 10
50  | 260.00 | 260.20 | 0.20 | 330 | –  | 9
60  | 262.50 | 262.80 | 0.30 | 324 | –  | 9
70  | 264.30 | 264.70 | 0.40 | 314 | –  | 9
80  | 267.10 | 267.70 | 0.60 | 306 | –  | 9
90  | 269.50 | 270.61 | 1.11 | 298 | –  | 8
100 | 272.50 | 274.50 | 2.00 | 288 | –  | 8

Normally, it is observed that F ≪ D; therefore, we can combine the above equations as follows:

a_ij = m² A_ij   (8)

a_ij = (F² / D²) A_ij   (9)

D = F √(A_ij / a_ij)   (10)

where the focal length F and the actual LED size A_ij are known. The only requirement is to obtain the detected LED area a_ij from the IS.
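Equation (10) can be checked with a round-trip sketch; F = 4.2 mm and A = 71 cm² follow the text, while the image area a_ij is synthesized from (9) rather than measured (in practice it would come from the detected contour on the IS, in consistent units):

```python
import math

# Direct distance from (10): D = F * sqrt(A_led / a_image), where F is the
# focal length, A_led the physical LED area, and a_image the LED's image
# area on the sensor (all in consistent units, here mm and mm^2).
def direct_distance(F, A_led, a_image):
    return F * math.sqrt(A_led / a_image)

# Round-trip check: synthesize a_image for a known distance using (9),
# then recover that distance. F = 4.2 mm, A = 71 cm^2 = 7100 mm^2.
F, A, D_true = 4.2, 7100.0, 2560.0          # D_true = 2.56 m in mm
a = (F / D_true) ** 2 * A                   # image area predicted by (9)
print(round(direct_distance(F, A, a), 6))   # 2560.0
```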

Using (10), the distance can be measured accurately at a position at which the camera is vertically in the LOS of the LED. However, when the MR navigates through other places that are not just below the LED, some errors can occur during this measurement, because the circular LED then appears to the camera as an ellipse. This happens due to the radial distortion of the camera [44]; wide-angle, short-focal-length, and fisheye cameras are the main sources of radial distortion. Determining the intrinsic, extrinsic, and distortion coefficients of the IS and camera lens is required to remove the effect of radial distortion, which causes a significant transformation of the projected object relative to the actual object even at a normal focal length. To verify the localization scheme, a back-projection method is deployed to measure the area of the LED from the distorted image of real-world space [45]. Below, we exhibit a figure to illustrate the accuracy of the distance measurement. Although the accuracy remains greater than 98% even at the boundary position, we have undertaken one further step to mitigate this error. Let r be the radius of the circular LED, so that its area is πr². When the circular shape changes to an ellipse, one radius r reduces to r′, and the area becomes πrr′. Table 1 presents the data measured from an LED with a size of 71 cm² located on a ceiling of height 2.56 m. The pixels allocated for the detected LED at different positions are

Figure 9: Percentage of accuracy of the measured direct distances from the camera to the LED at different horizontal points (0–100 cm).

recorded, showing that at a far distance the detected contour becomes smaller than at a nearer one. We tabulated the radius of the ellipse, which is required to calculate the actual area of the ellipse and correct the positioning error.

These radius data are integrated within the application and are called when the radius is measured by the smartphone through image processing. Comparing the existing data with the actual area of the LED enables prediction of the direct distance.
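The distance calculation in (10), together with the elliptical-area correction above, can be sketched as follows. This is a minimal sketch in which the function names are our own; the numeric values mirror the paper's setup (71 cm² LED, 2.56 m ceiling, and an assumed 4.2 mm focal length), and the detected image area is synthesized from (9) rather than measured from a real image sensor.

```python
import math

def direct_distance(F, A_led, a_img):
    """Direct LED-to-camera distance via Eq. (10): D = F * sqrt(A/a).

    F      -- focal length of the camera lens
    A_led  -- actual (physical) LED area
    a_img  -- area of the LED's projection on the image sensor
    All three must be expressed in consistent units (e.g. metres).
    """
    return F * math.sqrt(A_led / a_img)

def elliptical_area(r, r_prime):
    """Area of the elliptical LED image seen off-axis: pi * r * r'."""
    return math.pi * r * r_prime

# Sanity check: a 71 cm^2 LED on a 2.56 m ceiling, assumed f = 4.2 mm.
F = 4.2e-3              # m
A = 71e-4               # m^2
D_true = 2.56           # m
a = (F / D_true) ** 2 * A   # projected area per Eq. (9)
print(round(direct_distance(F, A, a), 2))  # recovers 2.56
```

Inverting (9) in this way recovers the original distance exactly, which is why the only measured quantity the method needs at run time is the detected contour area.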

The distance between the LED and the camera changes when the MR moves horizontally. The direct distance from a particular LED is required to calculate the horizontal distance that must be traveled to reach the floor location of that particular LED. We used (10) to measure the direct distance between the camera and the LED. Figure 9 graphically represents the percentage of accuracy observed while measuring the direct distance at particular points on the floor. The distance is measured with 100% accuracy when the camera is situated directly below the LED. When the

Wireless Communications and Mobile Computing 9

Figure 10: Distance error comparison in terms of the variation of the number of LEDs (2, 3, or 4) inside the camera FOV at different horizontal points (0–100 cm): (a) inter-LED distance = 50 cm and (b) inter-LED distance = 100 cm.

camera is shifted horizontally far away from the floor location of the LED, the accuracy is reduced.

To evaluate the performance of the proposed localization algorithm, we tested the accuracy of the distance measurement for different types of scenarios. We obtained a maximum distance measurement error of 2 cm in the case of a single LED inside the camera FOV. However, when there are multiple LEDs inside the camera FOV, the distance error increases because the probability of an LED being at the middle of the IS is reduced. Figure 10 compares the error in measuring the distance between the LEDs and the camera for different numbers of LEDs within the camera FOV. We considered two scenarios: in the first, the inter-LED distance is 50 cm, and in the second, it is 100 cm. In each scenario, we considered two to four LEDs for the comparison. We started measuring from the floor position of an LED and took ten more readings, moving away by about 10 cm in each step. The figure shows that if more LEDs are on the IS at the same time, the distance measurement error increases, because the chance of an LED being detected at the center of the IS is accordingly reduced. As our system comprises LEDs distributed in a rectangular pattern, four LEDs are most likely to be on the IS at a time. For this case, we have to consider up to 7.5 cm of distance error for positioning and navigation purposes when the inter-LED distance is 50 cm. For the second case, described in Figure 10(b), the distance error increases up to 8.5 cm for four LEDs due to the increase of the inter-LED spacing to 100 cm.

Although the distance error reduces with fewer LEDs allowed into the camera FOV, more than two LEDs are required within the FOV to obtain the navigation information, so this error must be accepted. We have also described a way in this section that can be applied to reduce the distance error to a certain level when moving a horizontal distance to reach the target LED location, thereby increasing the positioning performance.

Figure 11: BER comparison for M-FSK (M = 2, 4, 8) in an indoor wireless Rician fading channel.

6.2. Bit Error Rate for M-FSK. To evaluate the bit error rate (BER) performance, we considered an indoor wireless Rician fading channel. M-FSK is simple and flexible for multilink communication over this channel. It is known [46] that the bit error probability for M-FSK is

ρ_s = (1/M) Σ_{i=2}^{M} (−1)^i C(M, i) exp[−(1 − 1/i)ρ] (11)

where M is the modulation order and ρ is the instantaneous signal-to-noise ratio (SNR) value. In Figure 11, we depict a BER comparison between different modulation


Figure 12: Experimental setup. (a) Four LEDs (each of size A = 71 cm²) that are 50 cm apart from each other transmit the location IDs. (b) Captured image after setting the exposure time to 1/8000 sec.

orders (M = 2, 4, 8). We observed that the bit error probability decreases with increasing M when the transmission keeps the same bit energy, which means that the symbol duration for 4-FSK becomes double that of 2-FSK while the same bit rate is maintained. In such a case, the BER becomes lower as the symbol duration increases with a higher modulation order.
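Equation (11) can be evaluated numerically to reproduce the trend in Figure 11. The sketch below (function name ours) implements the formula directly; ρ is the instantaneous linear SNR, not dB.

```python
import math

def mfsk_error_prob(M, snr):
    """Error probability of noncoherent M-FSK per Eq. (11):
    p = (1/M) * sum_{i=2}^{M} (-1)^i * C(M, i) * exp(-(1 - 1/i) * snr)
    where snr is the instantaneous SNR on a linear scale."""
    return sum((-1) ** i * math.comb(M, i) * math.exp(-(1.0 - 1.0 / i) * snr)
               for i in range(2, M + 1)) / M

# For M = 2, Eq. (11) collapses to the classic 0.5 * exp(-snr/2)
print(math.isclose(mfsk_error_prob(2, 10.0), 0.5 * math.exp(-5.0)))  # True
```

The M = 2 special case is a useful self-check, since it matches the well-known noncoherent binary FSK result.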

7. Demonstration Results

We used a platform to demonstrate our OCC system and evaluate its performance. To understand the ID distribution system, we should receive the location information using a minimum of two diagonally located LEDs, as described in Section 5. The platform comprises four LED transmitter links placed in a square shape, as depicted in Figure 12(a). The LEDs are arranged with a distance of 50 cm between them. For simplicity of explanation, we considered these transmitters to be a single unit. These four transmitters are linked continuously and transmit four different locations, which are encoded with the help of a PC and an LED driver circuit.

To focus only on the target-transmitting LED, we controlled the camera exposure time. The exposure time is the duration for which each pixel is exposed to light. Using this process, we can obtain the dark and white strips in the region in which the LEDs are captured, by darkening all parts of the image except those that depict the target LEDs. Figure 12(b) depicts the image captured using the Samsung S7 (edge) camera of the transmitting LED unit after setting the camera's CaptureRequest.SENSOR_EXPOSURE_TIME to 1/8000 sec. We can observe that all four LEDs exhibit the same dark and bright strip width. This is because the same modulation frequencies were used for all four transmitters, and the frequencies ranged from 2 to 4 kHz. Generally,

the characteristics of the rolling image are such that the strips are formed horizontally. However, in our case, we rotated the image using ImageView by 270°, which is the reason for the vertically formed strips.

To demodulate the data from the detected strip pattern, we should initially measure the width of each dark and white strip. Different cameras have different frame rates, and even for a fixed camera, the frame rate fluctuates within a short range. For this reason, the width also varies, and because the width is measured as a number of pixels in a row, it should be an integer value. In our system, we observed brighter strips whose width varied between 5 and 7 pixels, whereas the width varied between 1 and 3 pixels for darker strips. Therefore, the mid-value of four was used to distinguish between dark and white patterns. When the measured width is greater than four, the smartphone considers the LED to be in the ON state and stores a binary '1' value; otherwise, the LED is assumed to be in the OFF state and a '0' value is stored. Figure 13 illustrates the implemented android application software that can simultaneously receive data from four LEDs. We received five bits of ID per LED and mapped each combination of the received binary ID to a fixed location in the application system. The table in Figure 13 depicts the manner in which the location is assigned to different IDs. To receive multiple LEDs, the overall time for processing one frame should be within the critical frame time; otherwise, the application may unfortunately stop. The critical frame time is defined as the maximum processing time for one frame. In our system, we used 20 fps to limit the critical frame time to 50 msec. We add a starting symbol to recognize when the data starts, so the time required to get the data depends on the timing of the frame containing that symbol. We tested in real time by debugging the software in Android Studio, observing the time required for initialization, frame processing, demodulation, and device response. We found that almost


Coordinate Bits
(1,1) 00000
(1,2) 00001
(2,1) 00010
(2,2) 00011
(1,3) 00100

Figure 13: Implemented android application receiving multiple LED IDs simultaneously.

1000 ms is required for debugging and getting a device response when communicating with four LEDs simultaneously. As each LED transmits 10 bits, we obtained data at a 40 bps rate for four LEDs with this device.
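The strip-width thresholding described earlier (bright strips 5–7 px, dark strips 1–3 px, mid-value 4) can be sketched as follows; the function name and sample widths are our own.

```python
def strips_to_bits(strip_widths, threshold=4):
    """Map measured strip widths (in pixels) to a bit string.

    A width above the mid-value threshold is read as LED ON ('1'),
    otherwise as LED OFF ('0'), mirroring the demodulation rule above.
    """
    return ''.join('1' if w > threshold else '0' for w in strip_widths)

print(strips_to_bits([6, 2, 5, 1, 7]))  # '10101'
```

Using strictly greater-than keeps the ambiguous width of exactly 4 px in the dark ('0') bin, which matches the rule that only widths greater than four indicate the ON state.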

8. Conclusion

This work focuses on the reception of data from multiple indoor LED transmitters as well as on the localization and navigation of the MR by processing the received data. The indoor LED locations are received as binary data of five bits per link at a data rate of 40 bps. The proposed algorithm was implemented in an indoor platform where four LEDs, each with a size of 71 cm², were kept at a distance of 50 cm from each other. The four LEDs transmit four different IDs, and an android application developed to receive these IDs performed well simultaneously. We were able to communicate with the LEDs from a maximum distance of 480 cm. As our positioning approach requires measurement of the direct distance from the camera to the LED, we compared the accuracy of the distance measurement at different floor positions. With the floor height constant, we experienced an increase in the percentage of error when the camera moved a long horizontal distance. To mitigate this error, we fixed some recorded distance values within the application for comparison. We found a maximum 2 cm error at a 100 cm horizontal distance from the floor position of the target LED when the MR moved horizontally toward that LED. Thus, the accuracy is improved from the localization

and navigation viewpoint as well as in terms of the distance for indoor communication.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest

Acknowledgments

This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the Global IT Talent support program (IITP-2017-0-01806) supervised by the IITP (Institute for Information and Communication Technology Promotion), and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (Performance Enhancement and Resource Management of Visual MIMO-based Camera Communication Networks) (No. NRF-2016R1D1A1A09919091).

References

[1] M. Z. Chowdhury, M. T. Hossan, A. Islam, and Y. M. Jang, "A Comparative Survey of Optical Wireless Technologies: Architectures and Applications," IEEE Access, vol. 6, pp. 9819–9840, 2018.

[2] M. T. Hossan, M. Z. Chowdhury, A. Islam, and Y. M. Jang, "A Novel Indoor Mobile Localization System Based on Optical Camera Communication," Wireless Communications and Mobile Computing, vol. 2018, 17 pages, 2018.

[3] A. Khalajmehrabadi, N. Gatsis, and D. Akopian, "Modern WLAN Fingerprinting Indoor Positioning: Methods and Deployment Challenges," IEEE Communications Surveys & Tutorials, vol. 19, no. 3, pp. 1974–2002, 2017.

[4] E. Kupershtein, M. Wax, and I. Cohen, "Single-site emitter localization via multipath fingerprinting," IEEE Transactions on Signal Processing, vol. 61, no. 1, pp. 10–21, 2013.

[5] P. Luo et al., "Experimental demonstration of RGB LED-based optical camera communications," IEEE Photonics Journal, vol. 7, no. 5, pp. 1–12, 2015.

[6] R. D. Roberts, "Undersampled frequency shift ON-OFF keying (UFSOOK) for camera communications (CamCom)," in Proceedings of the 22nd Wireless and Optical Communications Conference, WOCC 2013, pp. 645–648, China, May 2013.

[7] P. Luo, Z. Ghassemlooy, H. L. Minh, H. Tsai, and X. Tang, "Undersampled-PAM with subcarrier modulation for camera communications," in Proceedings of the 20th Opto-Electronics and Communications Conference (OECC), pp. 1–3, Shanghai, China, 2015.

[8] Y.-Y. Won, S.-H. Yang, D.-H. Kim, and S.-K. Han, "Three-dimensional optical wireless indoor positioning system using location code map based on power distribution of visible light emitting diode," IET Optoelectronics, vol. 7, no. 3, pp. 77–83, 2013.

[9] C. Yang and H.-R. Shao, "WiFi-based indoor positioning," IEEE Communications Magazine, vol. 53, no. 3, pp. 150–157, 2015.

[10] R. Xu, W. Chen, Y. Xu, and S. Ji, "A New Indoor Positioning System Architecture Using GPS Signals," Sensors, vol. 15, no. 5, pp. 10074–10087, 2015.

[11] Z. Zhou, M. Kavehrad, and P. Deng, "Indoor positioning algorithm using light-emitting diode visible light communications," Optical Engineering, vol. 51, no. 8, 2012.

[12] W. Zhang, M. I. S. Chowdhury, and M. Kavehrad, "Asynchronous indoor positioning system based on visible light communications," Optical Engineering, vol. 53, no. 4, Article ID 045105, 2014.

[13] S.-Y. Jung, S. Hann, S. Park, and C.-S. Park, "Optical wireless indoor positioning system using light emitting diode ceiling lights," Microwave and Optical Technology Letters, vol. 54, no. 7, pp. 1622–1626, 2012.

[14] H.-S. Kim, D.-R. Kim, S.-H. Yang, Y.-H. Son, and S.-K. Han, "An indoor visible light communication positioning system using a RF carrier allocation technique," Journal of Lightwave Technology, vol. 31, no. 1, pp. 134–144, 2013.

[15] K. Panta and J. Armstrong, "Indoor localisation using white LEDs," IEEE Electronics Letters, vol. 48, no. 4, pp. 228–230, 2012.

[16] A. D. Cheok and L. Yue, "A novel light-sensor-based information transmission system for indoor positioning and navigation," IEEE Transactions on Instrumentation and Measurement, vol. 60, no. 1, pp. 290–299, 2011.

[17] M. Nakajima and S. Haruyama, "New indoor navigation system for visually impaired people using visible light communication," EURASIP Journal on Wireless Communications and Networking, vol. 2013, article 37, 2013.

[18] H. Huang, A. Yang, L. Feng, G. Ni, and P. Guo, "Indoor Positioning Method Based on Metameric White Light Sources and Subpixels on a Color Image Sensor," IEEE Photonics Journal, vol. 8, no. 6, 2016.

[19] M. S. Hossen, Y. Park, and K. Kim, "Performance improvement of indoor positioning using light-emitting diodes and an image sensor for light-emitting diode communication," Optical Engineering, vol. 54, no. 3, pp. 1–11, 2015.

[20] M.-G. Moon, S.-I. Choi, J. Park, and J. Y. Kim, "Indoor positioning system using LED lights and a dual image sensor," Journal of the Optical Society of Korea, vol. 19, no. 6, pp. 586–591, 2015.

[21] Y.-S. Kuo, P. Pannuto, K.-J. Hsiao, and P. Dutta, "Luxapose: Indoor positioning with mobile phones and visible light," in Proceedings of the 20th ACM Annual International Conference on Mobile Computing and Networking, MobiCom 2014, pp. 447–458, USA, September 2014.

[22] P. Huynh and M. Yoo, "VLC-based positioning system for an indoor environment using an image sensor and an accelerometer sensor," Sensors, vol. 16, no. 6, 2016.

[23] H. Steendam, T. Q. Wang, and J. Armstrong, "Theoretical lower bound for indoor visible light positioning using received signal strength measurements and an aperture-based receiver," Journal of Lightwave Technology, vol. 35, no. 2, pp. 309–319, 2017.

[24] F. Yin, Y. Zhao, F. Gunnarsson, and F. Gustafsson, "Received-signal-strength threshold optimization using Gaussian processes," IEEE Transactions on Signal Processing, vol. 65, no. 8, pp. 2164–2177, 2017.

[25] X. Piao, Y. Zhang, T. Li et al., "RSS fingerprint based indoor localization using sparse representation with spatio-temporal constraint," Sensors, vol. 16, no. 11, 2016.

[26] C. Feng, W. S. A. Au, S. Valaee, and Z. Tan, "Received-signal-strength-based indoor positioning using compressive sensing," IEEE Transactions on Mobile Computing, vol. 11, no. 12, pp. 1983–1993, 2012.

[27] W. Gu, W. Zhang, M. Kavehrad, and L. Feng, "Three-dimensional light positioning algorithm with filtering techniques for indoor environments," Optical Engineering, vol. 53, no. 10, 2014.

[28] S.-H. Yang, E.-M. Jung, and S.-K. Han, "Indoor location estimation based on LED visible light communication using multiple optical receivers," IEEE Communications Letters, vol. 17, no. 9, pp. 1834–1837, 2013.

[29] F. I. K. Mousa, H. Le-Minh, Z. Ghassemlooy et al., "Indoor localization system utilizing two visible light emitting diodes," Optical Engineering, vol. 55, no. 11, 2016.

[30] K. Wang, A. Nirmalathas, C. Lim, K. Alameh, and E. Skafidas, "Optical wireless-based indoor localization system employing a single-channel imaging receiver," Journal of Lightwave Technology, vol. 34, no. 4, pp. 1141–1149, 2016.

[31] S.-H. Jung, B.-C. Moon, and D. Han, "Unsupervised learning for crowdsourced indoor localization in wireless networks," IEEE Transactions on Mobile Computing, vol. 15, no. 11, pp. 2892–2906, 2016.

[32] X. Wang, L. Gao, S. Mao, and S. Pandey, "CSI-Based Fingerprinting for Indoor Localization: A Deep Learning Approach," IEEE Transactions on Vehicular Technology, vol. 66, no. 1, pp. 763–776, 2017.

[33] Y. Hu and G. Leus, "Robust differential received signal strength-based localization," IEEE Transactions on Signal Processing, vol. 65, no. 12, pp. 3261–3276, 2017.

[34] T.-H. Do and M. Yoo, "TDOA-based indoor positioning using visible light," Photonic Network Communications, vol. 27, no. 2, pp. 80–88, 2014.

[35] S.-Y. Jung, S. Hann, and C.-S. Park, "TDOA-based optical wireless indoor localization using LED ceiling lamps," IEEE Transactions on Consumer Electronics, vol. 57, no. 4, pp. 1592–1597, 2011.

[36] U. Nadeem, N. U. Hassan, M. A. Pasha, and C. Yuen, "Highly accurate 3D wireless indoor positioning system using white LED lights," IEEE Electronics Letters, vol. 50, no. 11, pp. 828–830, 2014.

[37] T. Q. Wang, Y. A. Sekercioglu, A. Neild, and J. Armstrong, "Position accuracy of time-of-arrival based ranging using visible light with application in indoor localization systems," Journal of Lightwave Technology, vol. 31, no. 20, pp. 3302–3308, 2013.

[38] O. Ennasr, G. Xing, and X. Tan, "Distributed time-difference-of-arrival (TDOA)-based localization of a moving target," in Proceedings of the 55th IEEE Conference on Decision and Control, CDC 2016, pp. 2652–2658, USA, December 2016.

[39] S. H. Yang, E. M. Jeong, and S. K. Han, "Indoor positioning based on received optical power difference by angle of arrival," IEEE Electronics Letters, vol. 50, no. 1, pp. 49–51, 2014.

[40] A. Arafa, X. Jin, and R. Klukas, "Wireless indoor optical positioning with a differential photosensor," IEEE Photonics Technology Letters, vol. 24, no. 12, pp. 1027–1029, 2012.

[41] M. T. Hossan, M. Z. Chowdhury, M. K. Hasan et al., "A New Vehicle Localization Scheme Based on Combined Optical Camera Communication and Photogrammetry," Mobile Information Systems, vol. 2018, Article ID 8501898, 14 pages, 2018.

[42] S. M. Berman, D. S. Greenhouse, I. L. Bailey, R. D. Clear, and T. W. Raasch, "Human electroretinogram responses to video displays, fluorescent lighting, and other high frequency sources," Optometry and Vision Science, vol. 68, no. 8, pp. 645–662, 1991.

[43] H.-Y. Lee, H.-M. Lin, Y.-L. Wei, H.-I. Wu, H.-M. Tsai, and K. C.-J. Lin, "RollingLight: Enabling line-of-sight light-to-camera communications," in Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services, MobiSys 2015, pp. 167–180, Italy, May 2015.

[44] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.

[45] H. A. Martins, J. R. Birk, and R. B. Kelley, "Camera models based on data from two calibration planes," Computer Graphics and Image Processing, vol. 17, no. 2, pp. 173–180, 1981.

[46] E. Arthurs and H. Dym, "On the Optimum Detection of Digital Signals in the Presence of White Gaussian Noise: A Geometric Interpretation and a Study of Three Basic Data Transmission Systems," IRE Transactions on Communications Systems, vol. 10, no. 4, pp. 336–372, 1962.



Figure 2: Transmitted data bit mechanisms. Bit definition (bi-phase Manchester encoding): '0' → '00', '1' → '01'; modulation periods of 500 μs (2 kHz) and 250 μs (4 kHz); starting bits precede the data bits (example data '11011' encoded as '0101000101').

Visual flickering in optical wireless communication is a major problem that can be defined as the change in the illumination of the LED, as perceived by the human eye, when it transmits a binary '1'/'0' code-word. Generally, human eyes cannot perceive flickering if the data is modulated with a frequency greater than 200 Hz [42]. The LED transmitter should be driven using an accurate LED driver circuit to avoid any noticeable flicker. Multiple frequency-shift keying (MFSK) is used as the modulation technique to generate the modulated binary code-words using the LED driver circuit. Figure 2 depicts the characteristics of the transmitted bit pattern. Our proposed study uses the modulation frequencies 2 and 4 kHz, which are safe enough to avoid flickering. Further, the modulated signal can be encoded using bi-phase Manchester line coding. Each of the binary bits 1 and 0 is defined using a different code: bit '1' is defined as '01', whereas bit '0' is defined as '00'. We transmitted five bits of data from each LED; therefore, in a single indoor system, 32 location IDs can be provided to 32 LEDs. Furthermore, if the actual data is '11011', the encoded signal after bi-phase Manchester line coding will be '0101000101', as shown in Figure 2. Although there are three consecutive '0's in this example, the bi-phase Manchester encoding eliminates the possibility of visual flickering because it ensures one transition in the middle of each bit. As this code stream is transmitted continuously, a starting bit symbol '11' is added at the beginning of the code-word to differentiate between successive code-words.
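The encoding steps above can be sketched as a few lines of code; the function name is our own, and the mapping follows the bit definitions stated in the text.

```python
def encode_id(bits, start_symbol='11'):
    """Encode a location ID for transmission.

    Applies the paper's bi-phase Manchester mapping (bit '1' -> '01',
    bit '0' -> '00') and prepends the starting symbol '11' that
    delimits consecutive code-words in the continuous stream.
    """
    table = {'1': '01', '0': '00'}
    return start_symbol + ''.join(table[b] for b in bits)

print(encode_id('11011'))  # '110101000101'
```

For the example data '11011' this yields the code-word '0101000101' from the text, preceded by the '11' starting symbol.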

4. Receiver Characteristics

4.1. Android Application Background. An application is designed on the Android Studio platform using the camera2 hardware package, which was added in API level 21. Some android features used to create the camera device and process the requests are described in this study. One request acts to capture a single frame and output a set of image buffers for that request. A CameraManager is required when multiple requests are in a queue, to maintain the full frame rate. To accumulate the captured frame data, a getCameraCharacteristics call is used. The image data is encapsulated in Image objects and can be directly accessed through a class named

ImageReader, using a YUV_420_888 format. The image data is rendered onto a surface with a defined size. For processing the captured image frames, OpenCV 3.2.0 libraries were imported into the application. Initially, the camera shutter speed is controlled to focus only on the white LEDs and to detect only the region of interest (RoI).

4.2. Rolling Shutter Effect. Currently, most consumer cameras contain complementary metal-oxide-semiconductor (CMOS) sensors, which include rolling shutter mechanisms. The sequential read-out technique is a key feature of the rolling shutter camera, in which each frame is not captured at exactly the same instant [43]. The scene is captured by scanning rapidly either horizontally or vertically, unlike a global shutter camera, in which the complete scene is captured at the same time. In a rolling shutter camera, each row of pixels is exposed at once for the exposure time, and the read-out time protects the rows of pixels from overlapping. In a single captured image, the rolling shutter allows multiple exposures. Therefore, for an LED that flickers on-off according to the modulated binary bit stream, the captured image contains a bunch of dark and white strips. The width of the strips depends on the modulation frequencies, and the number of strips depends on the distance.

Although a different camera may receive signals of similar frequencies from the same transmitter, the width of the strips differs because the specifications of the camera sensors may differ from device to device. In our system, a multiple frequency-modulated signal is transmitted, and the dark and white strips are received at different distances, as depicted in Figure 3. The camera captured the pictures with a fixed clockwise ImageView rotation of 270°. As the distance increases, the size of the LED on the IS decreases; further, the number of strips also decreases because the width remains identical for fixed frequencies at any distance for a particular camera.

4.3. Introduction to MR Functionalities. Localization and navigation are challenging issues in optical camera communication for existing MRs. The initial requirement is to connect to the input interface of the MR to provide the necessary data from the android phone, such as the location ID and its distribution system. When a camera captures the rolling images from multiple LEDs, it processes the images and obtains the locations, thereby making a decision about the location distribution system, and then feeds the data to the MR. The MR must exhibit a user input interface (UI) to set a target location ID, a database (DB) to store the ID distribution information, and a display screen to exhibit the location information and the subsequent direction to move after making a decision. Figure 4 depicts the specifications of the MR and its required functionalities. It also exhibits a data- and image-processing unit to compare the IDs and a database to store the previous tracking information. The MR receives the IDs and can therefore localize itself. It also has the capability to compare the stored IDs and navigate by itself if the user sets a destination.

5. Operational Principle

5.1. Navigation Algorithm. The objective of the MRN system is to navigate an MR by receiving the locations or location


Figure 3: Experimentally, the number of dark and white strips decreases with the increase in distance: (a) 52 strips at a distance of 100 cm; (b) 28 strips at a distance of 250 cm; (c) 5 strips at a distance of 480 cm.

Figure 4: Block diagram of the mobile robot functionalities (smart phone unit: android phone camera, image processing, ID distribution system, and coordinate ID; MR unit: MRPU, DB, UI, and a display screen showing the location information and the direction to navigate).

IDs using multiple LED lights. Figure 5 describes a flowchart of the navigation algorithm applied in our proposed system. A user inputs a target location into the smart phone that is

mounted on top of an indoor MR. Because the complete environment of the room is unknown to the MR, it should first fix its position, which means that a location inside the room must be observed. Therefore, it estimates the number of LEDs that are within the camera's FOV. To perform navigation, the MR should perceive the direction in which the location values are changing. Thus, a minimum of three LEDs should be within the camera's FOV, as depicted in Figure 1(a) in the previous section. Here, in case I, the IS can detect only two LEDs, each of which exhibits the same value for the x location; thus, it is difficult to observe a variation in the value of the x location. In cases II and III, there are four LEDs within the camera's FOV, as the MR is situated deep inside the room. If three or more LEDs are detected using the IS, the smart phone receives the LED locations and compares them with the target location. If the target location matches any of the received locations, the MR moves a horizontal distance toward the floor point that is located directly below the target LED.
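The decision step just described (head for the target LED if its ID is visible, otherwise for the detected LED nearest to the target coordinate) can be sketched as follows; the function name, the distance metric, and the sample coordinates are our own, not part of the paper's implementation.

```python
def next_move(detected, target):
    """Choose the next LED coordinate to move under.

    detected -- iterable/dict of (x, y) location IDs currently decoded
                from the LEDs within the camera FOV
    target   -- the (x, y) location ID the user entered
    Returns the target itself if visible; otherwise the detected
    coordinate closest to the target on the square ceiling grid.
    """
    if target in detected:
        return target
    # Fall back to the detected LED nearest to the target coordinate.
    return min(detected,
               key=lambda c: (c[0] - target[0]) ** 2 + (c[1] - target[1]) ** 2)

leds = {(1, 1), (1, 2), (2, 1)}
print(next_move(leds, (3, 1)))  # heads toward (2, 1)
```

Repeating this choice after each move reproduces the hop-by-hop behavior of the flowchart in Figure 5, where the MR re-reads the visible IDs under each intermediate LED until the destination is reached.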

As mentioned in the previous paragraph, to localize the MR to the floor position of a target LED, we need to measure the horizontal distance from the MR to that floor position. Here, the floor position of an LED means the position on the floor that is directly below the LED. Suppose the target LED is transmitting the (x, y) location ID. When the MR is located at its initial position, as depicted in Figure 6, the implemented android smartphone app calculates the direct distance D from the ceiling LED. The direction toward the LED is predicted from the image formation on the IS (the image formation technique on the IS is well described in [42]). After


Figure 5: The mobile robot navigation flowchart (detect the N LEDs within the camera FOV; if N ≥ 3, get the LED coordinates, compare them with the target coordinate, and either move the horizontal distance to the matched LED or to the nearest LED on the way to the target, repeating until the target coordinate is found and navigation terminates).

obtaining the direct distance, we can easily find the horizontal distance s using the Pythagorean formula for a right-angle triangle, as the vertical distance is fixed inside a room. After calculating the value of s and determining the direction to move, the MR navigates this distance and fixes its location to the (x, y) location. If the target location does not match any of the received location IDs, the MR estimates the nearest LED on the route to the destination. Subsequently, the MR traverses the required distance to a point exactly below that particular LED. The MR then again receives the locations of the LEDs within the camera's FOV, and the procedure continues until the destination is reached. If the camera detects only two LEDs within its FOV, the MR cannot determine the location distribution system; therefore, it initially chooses one ID, goes to a point located exactly below that LED, and then continues with the aforementioned procedure.
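The Pythagorean step above amounts to one line of arithmetic; a minimal sketch (function name ours), with the direct distance D and the fixed ceiling height h in the same units:

```python
import math

def horizontal_distance(D, h):
    """Horizontal distance s the MR must travel toward the LED's
    floor position, from the direct camera-to-LED distance D and the
    fixed vertical (ceiling) distance h: s = sqrt(D^2 - h^2)."""
    return math.sqrt(D * D - h * h)

print(horizontal_distance(5.0, 4.0))  # 3.0
```

Since h is constant for a given room, only D (from Eq. (10)) is needed at run time to obtain s.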

5.2. Camera Field of View. We considered the indoor environment to possess a uniform square distribution of LEDs. Figure 7 depicts the manner in which the number of LEDs within the camera's FOV depends on different factors, such as the height of the room, the distance between two adjacent LEDs, and the angle of view (AOV), i.e., the FOV angle. For the purpose of navigation, the MR should understand the location distribution system among the LEDs. It is obvious

that at least two diagonally located LED locations should beconsidered to make a decision about the direction in whichthe X and Y locations are increasing or decreasing A relationhas been derived to estimate the number of LEDs that werecaptured within the FOV at any time considering the cameraview angle the height of the roof and the surface area takenby a single LED

The horizontal and vertical FOV can be estimated using the following two equations:

φ_h = 2 tan⁻¹(d_h / 2f) (1)

φ_v = 2 tan⁻¹(d_v / 2f) (2)

where φ_v and φ_h are the FOV in the vertical and horizontal directions, d_h and d_v are the sensor dimensions, and f represents the focal length of the camera lens. Figure 7 shows the camera FOV and its area projected vertically onto the roof at height h_r. We can represent the area of the FOV using the following equation:

A_FOV = 4h_r² tan(φ_v) tan(φ_h) (3)

Because the LEDs are distributed in a square pattern, with each LED separated by a distance a along each side (where a > LED diameter), each LED occupies an area of a².

Wireless Communications and Mobile Computing 7

Figure 6: Calculation procedure of the distance to move horizontally.

Figure 7: Analysis of the camera's FOV.

Now we can find the number of LEDs that will be in the camera FOV using (3):

N_LED = 4h_r² tan(φ_v) tan(φ_h) / a² (4)
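Equations (1)-(4) can be checked numerically with a short Python sketch (the sensor dimensions and LED spacing used in the example are illustrative values, not measurements from the paper):

```python
import math

def fov_angles(d_h_mm: float, d_v_mm: float, f_mm: float):
    """Horizontal and vertical FOV angles per (1)-(2): phi = 2*atan(d / 2f)."""
    return 2 * math.atan(d_h_mm / (2 * f_mm)), 2 * math.atan(d_v_mm / (2 * f_mm))

def num_leds_in_fov(phi_h: float, phi_v: float, h_r: float, a: float) -> float:
    """LED count per (4): N = A_FOV / a^2, with A_FOV from (3).

    h_r (roof height) and a (inter-LED spacing) must share one unit, e.g. meters.
    """
    a_fov = 4 * h_r**2 * math.tan(phi_v) * math.tan(phi_h)  # eq. (3)
    return a_fov / a**2

# Illustrative: 5.6 mm x 4.2 mm sensor, 4.2 mm focal length, 2.56 m roof, 0.5 m spacing
phi_h, phi_v = fov_angles(5.6, 4.2, 4.2)
n = num_leds_in_fov(phi_h, phi_v, 2.56, 0.5)
```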

5.3. Exploring the ID Distribution System. The smart phone used in this study was a Samsung S7 edge, which has a 26 mm (equivalent) camera with a focal length of 4.2 mm. The height of the room observed using the camera was 2.56 m. We set up the android camera with a height and width of 600 and 800 px, respectively. Under this configuration, there were always at least four LEDs within the camera's FOV in a general scenario. To define the coordinate distribution system, we divided the IS into four regions, i.e., the left-front region (LFR), front-right region (FRR), right-back region (RBR), and back-left region (BLR), as depicted in Figure 8, where each region contains an equal number of pixels. The camera detects a single LED per region and receives its location ID as well. We defined the locations on a regional basis as (X_LF, Y_LF), (X_FR, Y_FR), (X_RB, Y_RB), and (X_BL, Y_BL), respectively, within the four regions. It is obvious that the LEDs will be present either in both the LF and RB regions or in both the FR and BL regions. If X_LF < X_RB or X_FR > X_BL, the x-location increases in the right direction; otherwise, it increases in the left direction. Further, if Y_LF > Y_RB or Y_FR > Y_BL, the y-location increases in the front direction; otherwise, it increases in the back direction.

Figure 8: Region-basis location selection and exploration of their distribution system.
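The direction test described above reduces to two comparisons; a small Python sketch (function and argument names are ours) makes the logic explicit:

```python
def axis_directions(lf, fr, rb, bl):
    """Infer coordinate growth directions from the per-region LED IDs.

    Each argument is an (x, y) tuple or None; the LEDs appear either in the
    LF/RB pair or in the FR/BL pair of image-sensor regions.
    Returns (x_grows_right, y_grows_front).
    """
    if lf is not None and rb is not None:
        return lf[0] < rb[0], lf[1] > rb[1]
    if fr is not None and bl is not None:
        return fr[0] > bl[0], fr[1] > bl[1]
    raise ValueError("need a diagonal LED pair to infer directions")

# Example: LEDs detected in the LF and RB regions
dirs = axis_directions((1, 2), None, (2, 1), None)  # -> (True, True)
```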

6. Performance Analysis

6.1. Measurement of Distance Accuracy. The direct distance from the camera to an LED can be measured by considering a few distance parameters. If D is the direct distance of the LED (x_i, y_j) from the camera lens, and d is the distance of the formed image on the sensor from the camera lens, then from the general lens equation we can write:

1/D + 1/d = 1/F (5)

d/D = F/(D − F) (6)

where F is the focal length of the lens. Here, a magnification factor is required to compare the relative size of the actual object and the detected image on the IS.

The magnification factor m can be written as follows:

m = √(a_ij / A_ij) = d/D (7)

where A_ij is the actual surface area of the nearest LED (x_i, y_j), and a_ij is the area covered by the detected image on the IS.


Table 1: Distance measurement data from an LED at different floor positions.

Horizontal distance d_h (cm) | Measured direct distance (cm) | Actual direct distance (cm) | Error in distance measurement (cm) | Detected area (px) | 1st radius r (px) | 2nd radius r′ (px)
0   | 255.5  | 255.5  | 0    | 348 | 11 | 11
10  | 255.88 | 255.9  | 0.02 | 347 | 11 | 10
20  | 256.65 | 256.7  | 0.05 | 341 | 11 | 10
30  | 257.6  | 257.7  | 0.1  | 337 | 11 | 10
40  | 258.85 | 259    | 0.15 | 336 | 11 | 10
50  | 260    | 260.2  | 0.2  | 330 | 11 | 9
60  | 262.5  | 262.8  | 0.3  | 324 | 11 | 9
70  | 264.3  | 264.7  | 0.4  | 314 | 11 | 9
80  | 267.1  | 267.7  | 0.6  | 306 | 11 | 9
90  | 269.5  | 270.61 | 1.11 | 298 | 11 | 8
100 | 272.5  | 274.5  | 2    | 288 | 11 | 8

Normally, it is observed that F ≪ D; therefore, we can combine the above equations as follows:

a_ij = m² A_ij (8)

a_ij = (F²/D²) A_ij (9)

D = F √(A_ij / a_ij) (10)

where the focal length F and the actual LED size A_ij are known. The only requirement is to obtain the detected LED area a_ij from the IS.
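As a minimal sketch, (10) can be evaluated directly once the detected pixel count is converted to sensor area; the per-pixel area constant below is a hypothetical calibration value, not a figure from the paper:

```python
import math

def direct_distance(focal_len_mm: float, led_area_cm2: float,
                    detected_area_px: int, px_area_mm2: float) -> float:
    """Camera-to-LED direct distance per (10): D = F * sqrt(A_ij / a_ij).

    The detected pixel count is first converted to sensor area using
    px_area_mm2, the (assumed) area of one pixel at the capture resolution.
    Returns the distance in millimeters.
    """
    a_ij_mm2 = detected_area_px * px_area_mm2
    A_ij_mm2 = led_area_cm2 * 100.0  # 1 cm^2 = 100 mm^2
    return focal_len_mm * math.sqrt(A_ij_mm2 / a_ij_mm2)

# Example with an assumed pixel area of 0.01 mm^2 and the 71 cm^2 LED
D_mm = direct_distance(4.2, 71.0, 348, 0.01)
```

Note how the estimate scales as 1/sqrt(a_ij): a quarter of the pixel area corresponds to twice the distance.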

Using (10), the distance can be measured accurately at a position at which the camera is vertically in the LOS of the LED. However, when the MR navigates through places that are not directly below the LED, some errors can be experienced during this measurement, because the shape of the circular LED then appears to the camera as an ellipse. This happens due to the radial distortion of the camera [44]. Wide-angle, short-focal-length, and fisheye cameras are the main sources of radial distortion. Determining the intrinsic, extrinsic, and distortion coefficients of the IS and the camera lens is required to remove the effect of radial distortion, which significantly transforms the projected object relative to the actual object even at a normal focal length. To verify the localization scheme, a back-projection method is deployed to measure the area of the LED from the distorted image of real-world space [45]. A figure illustrating the accuracy of the distance measurement is presented below. Although the accuracy remains greater than 98% even at the boundary position, we have undertaken one further step to mitigate this error. Let r be the radius of the circular LED, so that its area is πr². When the circular shape changes to an ellipse, one radius r reduces to r′, and the area becomes πrr′. Table 1 presents the data measured from an LED with a size of 71 cm² located on a ceiling of height 2.56 m. The pixels allocated to the detected LED at different positions are recorded, showing that at a far distance the detected contour becomes smaller than at a nearer one. We tabulated the radius r′ of the ellipse, which is required to calculate the actual area of the ellipse and correct the positioning error.

Figure 9: Percentage of accuracy of the measured direct distances from the camera to the LED at different horizontal points.
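The ellipse correction in Table 1 amounts to replacing πr² with πrr′ when computing the detected area; in Python:

```python
import math

def corrected_led_area_px(r_px: float, r_prime_px: float) -> float:
    """Detected LED area modeled as an ellipse, pi * r * r' (Table 1 radii).

    Directly below the LED, r == r' and this reduces to the circle area pi*r^2.
    """
    return math.pi * r_px * r_prime_px
```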

This radius data is integrated within the application and is looked up when the radius is measured by the smart phone through image processing. Comparing the stored data with the actual area of the LED makes it possible to measure the direct distance.

The distance between the LED and the camera changes when the MR moves horizontally. The direct distance from a particular LED is required to calculate the horizontal distance to move and to get the floor position of that particular LED. We used (10) to measure the direct distance between the camera and the LED. Figure 9 graphically represents the percentage of accuracy observed while measuring the direct distance at particular points on the floor. The distance is measured with 100% accuracy when the camera is situated directly below the LED. When the camera is shifted horizontally far away from the floor position of the LED, the accuracy is reduced.

Figure 10: Distance error comparison in terms of the number of LEDs inside the camera FOV at different horizontal points. (a) Inter-LED distance = 50 cm; (b) inter-LED distance = 100 cm.

To evaluate the performance of the proposed localization algorithm, we tested the accuracy of the distance measurement for different scenarios. We obtained a maximum distance measurement error of 2 cm in the case of a single LED inside the camera FOV. However, when there are multiple LEDs inside the camera FOV, the distance error increases, because the probability of an LED being at the middle of the IS is reduced. Figure 10 compares the error in measuring the distance between the LEDs and the camera for different numbers of LEDs in the camera FOV. We considered two scenarios: first, the case where the inter-LED distance is 50 cm, and second, the case where it is 100 cm. In each scenario, we considered two to four LEDs for the comparison. We started measuring from the floor position of an LED and took ten more readings, moving away about 10 cm in each step. The figure shows that if more LEDs are in the IS at the same time, the distance measurement error increases, because the chance of an LED being detected at the center of the IS is correspondingly reduced. As our system comprises LEDs distributed in a rectangular pattern, four LEDs are most likely to be in the IS at a time. For this case, we have to accept up to 7.5 cm of distance error for positioning and navigation purposes when the inter-LED distance is 50 cm. For the second case, described in Figure 10(b), the distance error increases up to 8.5 cm for four LEDs owing to the increase of the inter-LED spacing to 100 cm.

Although the distance error is reduced when fewer LEDs are allowed into the camera FOV, more than two LEDs are required within the FOV to obtain the navigation information, so this error must be accepted. We have also described a way in this section that can be applied to reduce the distance error to a certain level when moving a horizontal distance to reach the target LED location, increasing the positioning performance.

Figure 11: BER comparison for M-FSK (M = 2, 4, 8) in an indoor wireless Rician fading channel.

6.2. Bit Error Rate for MFSK. To evaluate the bit error rate (BER) performance, we considered an indoor wireless Rician fading channel. MFSK is simple and flexible for multilink communication over this channel. It is known [46] that the bit error probability for MFSK is

ρ_s = (1/M) Σ_{i=2}^{M} (−1)^i C(M, i) exp[−(1 − 1/i) ρ] (11)

where M is the modulation order and ρ is the instantaneous signal-to-noise ratio (SNR) value.

Figure 12: Experimental setup. (a) Four LEDs, 50 cm apart from each other, transmit the location IDs (LED area A = 71 cm²). (b) Captured image after setting the exposure time to 1/8000 s.

Figure 11 depicts a BER comparison between different modulation orders (M = 2, 4, 8). We observe that the bit error probability decreases with increasing M when the transmission keeps the same bit energy, which means that the symbol duration for 4-FSK becomes double that of 2-FSK while the same bit rate is maintained. In such a case, the BER becomes lower as the symbol duration increases with a higher modulation order.
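Formula (11) is straightforward to evaluate numerically; the Python sketch below (function name ours) can be used to regenerate curves like those in Figure 11, given a linear SNR value:

```python
import math

def mfsk_bit_error_prob(M: int, snr_linear: float) -> float:
    """Bit error probability of M-FSK per (11):
    rho_s = (1/M) * sum_{i=2}^{M} (-1)^i * C(M, i) * exp(-(1 - 1/i) * rho)."""
    total = 0.0
    for i in range(2, M + 1):
        total += (-1) ** i * math.comb(M, i) * math.exp(-(1.0 - 1.0 / i) * snr_linear)
    return total / M

# Example: 2-FSK at 6 dB SNR (convert dB to a linear ratio first)
ber = mfsk_bit_error_prob(2, 10 ** (6 / 10))
```

For M = 2, the sum collapses to the familiar noncoherent BFSK result, (1/2)·exp(−ρ/2).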

7. Demonstration Results

We used a platform to demonstrate our OCC system and evaluate its performance. To understand the ID distribution system, we should receive the location information from a minimum of two diagonally located LEDs, as described in Section 5. The platform comprises four LED transmitter links placed in a square shape, as depicted in Figure 12(a). The LEDs are arranged with a distance of 50 cm between them. For clarity of explanation, we considered these transmitters to be a single unit. These four transmitters are linked continuously and transmit four different locations, which are encoded with the help of a PC and an LED driver circuit.

To focus only on the target transmitting LEDs, we controlled the camera exposure time. The exposure time is the duration for which each pixel is exposed to light. Using this process, we can obtain the dark and white strips in the region where the LEDs are captured, by darkening all parts of the image except those that depict the target LEDs. Figure 12(b) depicts the image of the transmitting LED unit captured using the Samsung S7 (edge) camera after setting the camera's CaptureRequest.SENSOR_EXPOSURE_TIME to 1/8000 s. We can observe that all four LEDs exhibit the same dark and bright strip widths. This is because the same modulation frequencies, ranging from 2 to 4 kHz, were used for all four transmitters. Generally, the characteristics of the rolling-shutter image are such that the strips are formed horizontally. However, in our case, we rotated the image using imageView by 270°, which is why vertically formed strips are observed.

To demodulate the data from the detected strip pattern, we should initially measure the width of each dark and white strip. Different cameras have different frame rates, and even for a fixed camera the frame rate fluctuates within a short range. For this reason, the width also varies; as the width is measured as a number of pixels in a row, it should be an integer value. In our system, we observed brighter strips whose width varied between 5 and 7 pixels, whereas the width varied between 1 and 3 pixels for darker strips. Therefore, the midvalue of four was chosen to distinguish between dark and white patterns. When the measured width is greater than four, the smart phone considers the LED to be in the ON state and stores a binary '1' value; otherwise, the LED is assumed to be in the OFF state and a '0' value is stored. Figure 13 illustrates the implemented android application software, which can simultaneously receive data from four LEDs. We received five bits of ID per LED and mapped each combination of the received binary ID to a fixed location in the application. The table in Figure 13 depicts the manner in which locations are assigned to different IDs. To receive multiple LEDs, the overall time for processing one frame should be within the critical frame time; otherwise, the application may stop. The critical frame time is defined as the maximum processing time for one frame. In our system, we used 20 fps, which limits the critical frame time to 50 ms. We add a starting symbol to recognize when the data begins, so the time to get the data depends on when the frame containing that symbol is received. We tested this in real time by debugging the software in Android Studio, observing the time required for initialization, frame processing, demodulation, and device response. We found that almost 1000 ms is required for debugging and getting the device response when communicating with four LEDs simultaneously. As each LED transmits 10 bits, we obtained data at a rate of 40 bps for four LEDs with this device.

Figure 13: Implemented android application receiving multiple LED IDs simultaneously. The ID table maps bits to coordinates: 00000 → (1,1), 00001 → (1,2), 00010 → (2,1), 00011 → (2,2), 00100 → (1,3).
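The threshold rule above can be written as a one-line Python demodulator (a sketch; the function name is ours):

```python
def demodulate_strips(strip_widths_px):
    """Map measured strip widths to bits: bright strips were 5-7 px wide and
    dark strips 1-3 px, so a width above the midvalue of 4 is read as ON ('1')."""
    THRESHOLD_PX = 4
    return [1 if w > THRESHOLD_PX else 0 for w in strip_widths_px]

# Example: widths measured across one LED's strip pattern
bits = demodulate_strips([6, 2, 5, 7, 1])  # -> [1, 0, 1, 1, 0]
```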

8. Conclusion

This work focuses on the reception of data from multiple indoor LED transmitters as well as on the localization and navigation of the MR by processing the received data. The indoor LED locations are received as binary data of five bits per link at a data rate of 40 bps. The proposed algorithm was implemented on an indoor platform where four LEDs, each with a size of 71 cm², were kept at a distance of 50 cm from each other. The four LEDs transmit four different IDs, and the android application developed to receive these IDs performed well with all links simultaneously. We were able to communicate with the LEDs from a maximum distance of 480 cm. As our positioning approach requires measurement of the direct distance from the camera to the LED, we compared the accuracy of the distance measurement at different floor positions. With the floor height constant, the percentage of error increases when the camera moves a long distance horizontally. To mitigate this error, we fixed some recorded distance values within the application for comparison. We found a maximum error of 2 cm at a 100 cm horizontal distance from the floor position of the target LED when the MR moves horizontally toward that LED. Thus, the accuracy is improved from the localization and navigation viewpoint as well as in terms of distance for indoor communication.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest

Acknowledgments

This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the Global IT Talent support program (IITP-2017-0-01806) supervised by the IITP (Institute for Information and Communication Technology Promotion), and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (Performance Enhancement and Resource Management of Visual MIMO-based Camera Communication Networks) (No. NRF-2016R1D1A1A09919091).

References

[1] M. Z. Chowdhury, M. T. Hossan, A. Islam, and Y. M. Jang, "A Comparative Survey of Optical Wireless Technologies: Architectures and Applications," IEEE Access, vol. 6, pp. 9819–9840, 2018.
[2] M. T. Hossan, M. Z. Chowdhury, A. Islam, and Y. M. Jang, "A Novel Indoor Mobile Localization System Based on Optical Camera Communication," Wireless Communications and Mobile Computing, vol. 2018, 17 pages, 2018.
[3] A. Khalajmehrabadi, N. Gatsis, and D. Akopian, "Modern WLAN Fingerprinting Indoor Positioning: Methods and Deployment Challenges," IEEE Communications Surveys & Tutorials, vol. 19, no. 3, pp. 1974–2002, 2017.
[4] E. Kupershtein, M. Wax, and I. Cohen, "Single-site emitter localization via multipath fingerprinting," IEEE Transactions on Signal Processing, vol. 61, no. 1, pp. 10–21, 2013.
[5] P. Luo et al., "Experimental demonstration of RGB LED-based optical camera communications," IEEE Photonics Journal, vol. 7, no. 5, pp. 1–12, 2015.
[6] R. D. Roberts, "Undersampled frequency shift ON-OFF keying (UFSOOK) for camera communications (CamCom)," in Proceedings of the 22nd Wireless and Optical Communications Conference (WOCC 2013), pp. 645–648, China, May 2013.
[7] P. Luo, Z. Ghassemlooy, H. L. Minh, H. Tsai, and X. Tang, "Undersampled-PAM with subcarrier modulation for camera communications," in Proceedings of the 20th Opto-Electronics and Communications Conference (OECC), pp. 1–3, Shanghai, China, 2015.
[8] Y.-Y. Won, S.-H. Yang, D.-H. Kim, and S.-K. Han, "Three-dimensional optical wireless indoor positioning system using location code map based on power distribution of visible light emitting diode," IET Optoelectronics, vol. 7, no. 3, pp. 77–83, 2013.
[9] C. Yang and H.-R. Shao, "WiFi-based indoor positioning," IEEE Communications Magazine, vol. 53, no. 3, pp. 150–157, 2015.
[10] R. Xu, W. Chen, Y. Xu, and S. Ji, "A New Indoor Positioning System Architecture Using GPS Signals," Sensors, vol. 15, no. 5, pp. 10074–10087, 2015.
[11] Z. Zhou, M. Kavehrad, and P. Deng, "Indoor positioning algorithm using light-emitting diode visible light communications," Optical Engineering, vol. 51, no. 8, 2012.
[12] W. Zhang, M. I. S. Chowdhury, and M. Kavehrad, "Asynchronous indoor positioning system based on visible light communications," Optical Engineering, vol. 53, no. 4, Article ID 045105, 2014.
[13] S.-Y. Jung, S. Hann, S. Park, and C.-S. Park, "Optical wireless indoor positioning system using light emitting diode ceiling lights," Microwave and Optical Technology Letters, vol. 54, no. 7, pp. 1622–1626, 2012.
[14] H.-S. Kim, D.-R. Kim, S.-H. Yang, Y.-H. Son, and S.-K. Han, "An indoor visible light communication positioning system using an RF carrier allocation technique," Journal of Lightwave Technology, vol. 31, no. 1, pp. 134–144, 2013.
[15] K. Panta and J. Armstrong, "Indoor localisation using white LEDs," IEEE Electronics Letters, vol. 48, no. 4, pp. 228–230, 2012.
[16] A. D. Cheok and L. Yue, "A novel light-sensor-based information transmission system for indoor positioning and navigation," IEEE Transactions on Instrumentation and Measurement, vol. 60, no. 1, pp. 290–299, 2011.
[17] M. Nakajima and S. Haruyama, "New indoor navigation system for visually impaired people using visible light communication," EURASIP Journal on Wireless Communications and Networking, vol. 2013, article 37, 2013.
[18] H. Huang, A. Yang, L. Feng, G. Ni, and P. Guo, "Indoor Positioning Method Based on Metameric White Light Sources and Subpixels on a Color Image Sensor," IEEE Photonics Journal, vol. 8, no. 6, 2016.
[19] M. S. Hossen, Y. Park, and K. Kim, "Performance improvement of indoor positioning using light-emitting diodes and an image sensor for light-emitting diode communication," Optical Engineering, vol. 54, no. 3, pp. 1–11, 2015.
[20] M.-G. Moon, S.-I. Choi, J. Park, and J. Y. Kim, "Indoor positioning system using LED lights and a dual image sensor," Journal of the Optical Society of Korea, vol. 19, no. 6, pp. 586–591, 2015.
[21] Y.-S. Kuo, P. Pannuto, K.-J. Hsiao, and P. Dutta, "Luxapose: Indoor positioning with mobile phones and visible light," in Proceedings of the 20th ACM Annual International Conference on Mobile Computing and Networking (MobiCom 2014), pp. 447–458, USA, September 2014.
[22] P. Huynh and M. Yoo, "VLC-based positioning system for an indoor environment using an image sensor and an accelerometer sensor," Sensors, vol. 16, no. 6, 2016.
[23] H. Steendam, T. Q. Wang, and J. Armstrong, "Theoretical lower bound for indoor visible light positioning using received signal strength measurements and an aperture-based receiver," Journal of Lightwave Technology, vol. 35, no. 2, pp. 309–319, 2017.
[24] F. Yin, Y. Zhao, F. Gunnarsson, and F. Gustafsson, "Received-signal-strength threshold optimization using Gaussian processes," IEEE Transactions on Signal Processing, vol. 65, no. 8, pp. 2164–2177, 2017.
[25] X. Piao, Y. Zhang, T. Li et al., "RSS fingerprint based indoor localization using sparse representation with spatio-temporal constraint," Sensors, vol. 16, no. 11, 2016.
[26] C. Feng, W. S. A. Au, S. Valaee, and Z. Tan, "Received-signal-strength-based indoor positioning using compressive sensing," IEEE Transactions on Mobile Computing, vol. 11, no. 12, pp. 1983–1993, 2012.
[27] W. Gu, W. Zhang, M. Kavehrad, and L. Feng, "Three-dimensional light positioning algorithm with filtering techniques for indoor environments," Optical Engineering, vol. 53, no. 10, 2014.
[28] S.-H. Yang, E.-M. Jung, and S.-K. Han, "Indoor location estimation based on LED visible light communication using multiple optical receivers," IEEE Communications Letters, vol. 17, no. 9, pp. 1834–1837, 2013.
[29] F. I. K. Mousa, H. Le-Minh, Z. Ghassemlooy et al., "Indoor localization system utilizing two visible light emitting diodes," Optical Engineering, vol. 55, no. 11, 2016.
[30] K. Wang, A. Nirmalathas, C. Lim, K. Alameh, and E. Skafidas, "Optical wireless-based indoor localization system employing a single-channel imaging receiver," Journal of Lightwave Technology, vol. 34, no. 4, pp. 1141–1149, 2016.
[31] S.-H. Jung, B.-C. Moon, and D. Han, "Unsupervised learning for crowdsourced indoor localization in wireless networks," IEEE Transactions on Mobile Computing, vol. 15, no. 11, pp. 2892–2906, 2016.
[32] X. Wang, L. Gao, S. Mao, and S. Pandey, "CSI-Based Fingerprinting for Indoor Localization: A Deep Learning Approach," IEEE Transactions on Vehicular Technology, vol. 66, no. 1, pp. 763–776, 2017.
[33] Y. Hu and G. Leus, "Robust differential received signal strength-based localization," IEEE Transactions on Signal Processing, vol. 65, no. 12, pp. 3261–3276, 2017.
[34] T.-H. Do and M. Yoo, "TDOA-based indoor positioning using visible light," Photonic Network Communications, vol. 27, no. 2, pp. 80–88, 2014.
[35] S.-Y. Jung, S. Hann, and C.-S. Park, "TDOA-based optical wireless indoor localization using LED ceiling lamps," IEEE Transactions on Consumer Electronics, vol. 57, no. 4, pp. 1592–1597, 2011.
[36] U. Nadeem, N. U. Hassan, M. A. Pasha, and C. Yuen, "Highly accurate 3D wireless indoor positioning system using white LED lights," IEEE Electronics Letters, vol. 50, no. 11, pp. 828–830, 2014.
[37] T. Q. Wang, Y. A. Sekercioglu, A. Neild, and J. Armstrong, "Position accuracy of time-of-arrival based ranging using visible light with application in indoor localization systems," Journal of Lightwave Technology, vol. 31, no. 20, pp. 3302–3308, 2013.
[38] O. Ennasr, G. Xing, and X. Tan, "Distributed time-difference-of-arrival (TDOA)-based localization of a moving target," in Proceedings of the 55th IEEE Conference on Decision and Control (CDC 2016), pp. 2652–2658, USA, December 2016.
[39] S. H. Yang, E. M. Jeong, and S. K. Han, "Indoor positioning based on received optical power difference by angle of arrival," IEEE Electronics Letters, vol. 50, no. 1, pp. 49–51, 2014.
[40] A. Arafa, X. Jin, and R. Klukas, "Wireless indoor optical positioning with a differential photosensor," IEEE Photonics Technology Letters, vol. 24, no. 12, pp. 1027–1029, 2012.
[41] M. T. Hossan, M. Z. Chowdhury, M. K. Hasan et al., "A New Vehicle Localization Scheme Based on Combined Optical Camera Communication and Photogrammetry," Mobile Information Systems, vol. 2018, Article ID 8501898, 14 pages, 2018.
[42] S. M. Berman, D. S. Greenhouse, I. L. Bailey, R. D. Clear, and T. W. Raasch, "Human electroretinogram responses to video displays, fluorescent lighting, and other high frequency sources," Optometry and Vision Science, vol. 68, no. 8, pp. 645–662, 1991.
[43] H.-Y. Lee, H.-M. Lin, Y.-L. Wei, H.-I. Wu, H.-M. Tsai, and K. C.-J. Lin, "RollingLight: Enabling line-of-sight light-to-camera communications," in Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services (MobiSys 2015), pp. 167–180, Italy, May 2015.
[44] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[45] H. A. Martins, J. R. Birk, and R. B. Kelley, "Camera models based on data from two calibration planes," Computer Graphics and Image Processing, vol. 17, no. 2, pp. 173–180, 1981.
[46] E. Arthurs and H. Dym, "On the Optimum Detection of Digital Signals in the Presence of White Gaussian Noise—A Geometric Interpretation and a Study of Three Basic Data Transmission Systems," IRE Transactions on Communications Systems, vol. 10, no. 4, pp. 336–372, 1962.


Page 5: An Implementation Approach and Performance …...Analysis of Image Sensor Based Multilateral Indoor Localization and Navigation System Md.Shahjalal ,Md.TanvirHossan ,Moh.KhalidHasan

Wireless Communications and Mobile Computing 5

(a) 100 cm (b) 250 cm

(c) 480 cm

Figure 3 Experimentally the number of dark and white strips decreases with the increase in distance (a) 52 strips at a distance of 100 cm(b) 28 strips at a distance of 250 cm (c) 5 strips at a distance of 480 cm

Android phonecamera

ID distributionsystem Coordinate ID

MRPUDB UI

Display screen

Location info Direction tonavigate

MR Unit

Imageprocessing

Smart Phone Unit

Figure 4 Block diagram of the mobile robot functionalities

ID using multiple LED lights Figure 5 describes a flowchartof the applied navigation algorithm in our proposed systemA user inputs a target location into the smart phone that is

mounted on the top of an indoor MR Because the completeenvironment in the room is unknown to an MR it shouldfix its position first which indicates that a location shouldbe observed inside the room Therefore it estimates thenumber of LEDs that arewithin the camerarsquos FOVTo performnavigation theMR should perceive the direction inwhich thelocation values are changingThus aminimumof three LEDsshould be within the camerarsquos FOV as depicted in Figure 1(a)in previous section Here in case I the IS can only detecttwo LEDs among which each LED exhibits the same valuefor the x location further it is difficult to observe a variationin the value of the x location In cases II and III there arefour LEDs within the camerarsquos FOV as MR is situated deepinside the room If three or more LEDs are detected using theIS the smart phone receives the LED locations and comparesthem with the target location If the target location matchesany of the received locations the MR moves in a horizontaldistance toward the LED that is located directly below thetarget location

As mentioned in the previous paragraph to localizethe MR to a floor position of a target LED we need tomeasure the horizontal distance from the MR to that floorposition Here floor position of an LED means the positionon the floor which is just below the LED Suppose the targetLED is transmitting (x y) location ID When the MR islocated in its initial position as depicted in Figure 6 theimplemented android smartphone app calculates the directdistance D from the ceiling LED The direction towards theLED is predicted from the image formation on the IS (imageformation technique on IS is well described in [42]) After

6 Wireless Communications and Mobile Computing

Detect the LEDs withincamera FoV and define as N

Get LED co-ordinates

Measure horizontaldistance to that

LED

Pass anydistance

N ge 3

Compare co-ordinates with thetarget co-ordinate

Find the nearest LEDco-ordinate to gotarget co-ordinate

Pass thatdistance

Measure thehorizontal distance to

that LED

Are the distancesame

Set target co-ordinate to MR

Pass theshortestdistance

Navigationterminated

Target co-Ordinate found

Pass that distance

Measure thehorizontal distance

to both LED

Get LEDco-ordinates

Target co-ordinate found

Yes

No

Yes

No

Yes

Yes

No

No

Figure 5 The mobile robot navigation flowchart

obtaining the direct distance we can easily find the horizontaldistance s using Pythagoras distance formula for right angletriangle as the vertical distance is fixed inside a room Aftercalculating the value of s and understanding the directionto move MR navigates this distance and fixes its locationto (x y) location If the target location is not matched withany of the received location ID the MR estimates the nearestLED that is required to reach the destination Subsequentlythe MR traverses a certain amount of distance to go apoint that is exactly below that particular LED Furtherthe MR again receives the locations of the LEDs that arewithin camerarsquos FOV Subsequently the procedure continuesuntil the destination is reached If the camera detects twoLEDs within its FOV the MR cannot detect the locationdistribution systemTherefore it initially chooses one ID andgoes to a point that is located exactly below that LED andthen MR continues with the aforementioned procedure

5.2. Camera Field of View. We considered the indoor environment to possess a uniform square distribution of LEDs. Figure 7 depicts the manner in which the number of LEDs within the camera's FOV depends on different factors, such as the height of the room, the distance between two adjacent LEDs, and the angle of view (AOV), i.e., the FOV angle. For the purpose of navigation, the MR should understand the location distribution system among the LEDs. It is obvious

that at least two diagonally located LED locations should be considered to decide the directions in which the X and Y locations are increasing or decreasing. A relation has been derived to estimate the number of LEDs captured within the FOV at any time, considering the camera view angle, the height of the roof, and the surface area taken by a single LED.

The horizontal and vertical FOV can be estimated using the following two equations:

\phi_h = 2\tan^{-1}\left(\frac{d_h}{2f}\right) \quad (1)

\phi_v = 2\tan^{-1}\left(\frac{d_v}{2f}\right) \quad (2)

where φ_v and φ_h are the FOV in the vertical and horizontal directions, and d_h, d_v are the sensor dimensions, respectively; f represents the focal length of the camera lens. Figure 7 shows the camera FOV, whose area is projected vertically onto the roof of height h_r. We can represent the area of the FOV using the following equation:

A_{FOV} = 4h_r^2 \tan(\phi_v)\tan(\phi_h) \quad (3)

Because the LEDs are distributed in a square pattern with each LED separated by a distance a along each side (where a > LED diameter), each LED occupies an area of a².


Figure 6: Calculation procedure of the distance to move horizontally.

Figure 7: Analysis of the camera's FOV.

Now we can find the number of LEDs that will be in the camera FOV using (3):

N_{LED} = \frac{4h_r^2 \tan(\phi_v)\tan(\phi_h)}{a^2} \quad (4)

5.3. Exploring the ID Distribution System. The smartphone used to perform this study was a Samsung S7 edge, which contained a 26-mm camera with a focal length of 4.2 mm. The height of the room observed using the camera was 2.56 m. We set up the android camera with a height and width of 600 and 800 px, respectively. Under this type of consideration, there were always at least four LEDs within the camera's FOV in a general scenario. To define the coordinate distribution system, we divided

Figure 8: Region-basis location selection and exploring their distribution system.

the IS into four regions, i.e., the left-front region (LFR), front-right region (FRR), right-back region (RBR), and back-left region (BLR), as depicted in Figure 8, where each region contains an equal number of pixels. The camera will detect a single LED per region and receive its location ID as well. We defined the locations on a regional basis as (X_LF, Y_LF), (X_FR, Y_FR), (X_RB, Y_RB), and (X_BL, Y_BL), respectively, within the four regions. It is obvious that the LEDs will be present either in both the LF and RB regions or in both the FR and BL regions. If X_LF < X_RB or X_FR > X_BL, the x-location increases in the right direction; otherwise, it increases in the left direction. Further, if Y_LF > Y_RB or Y_FR > Y_BL, the y-location increases in the front direction; otherwise, it increases in the back direction.
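The direction rules above can be expressed compactly. The function below is a hypothetical sketch (the name and the (x, y)-tuple format are ours, not the paper's):

```python
def axis_directions(lf=None, fr=None, rb=None, bl=None):
    """Decide how the x- and y-coordinates grow on the floor grid from one
    diagonal pair of region-wise LED IDs, each given as an (x, y) tuple.
    LEDs appear either in the LF/RB pair or in the FR/BL pair (Section 5.3)."""
    if lf is not None and rb is not None:
        x_right, y_front = lf[0] < rb[0], lf[1] > rb[1]
    elif fr is not None and bl is not None:
        x_right, y_front = fr[0] > bl[0], fr[1] > bl[1]
    else:
        raise ValueError("need one diagonal pair of LED IDs")
    return ("right" if x_right else "left",
            "front" if y_front else "back")

# Example: LF holds ID (1, 2) and RB holds ID (2, 1), so the x-location
# increases to the right and the y-location increases to the front.
```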

6. Performance Analysis

6.1. Measurement of Distance Accuracy. The direct distance from the camera to an LED can be measured by considering a few distance parameters. If D is the direct distance of the LED (x_i, y_j) from the camera lens and d is the distance of the formed image on the sensor from the camera lens, then we can write, from the general lens equation:

\frac{1}{D} + \frac{1}{d} = \frac{1}{F} \quad (5)

\frac{d}{D} = \frac{F}{D - F} \quad (6)

where F is the focal length of the lens. Here, a magnification factor must be introduced to compare the actual size with the detected size on the IS.

The magnification factor m can be written as follows:

m = \sqrt{\frac{a_{ij}}{A_{ij}}} = \frac{d}{D} \quad (7)

where A_{ij} is the actual surface area of the nearest LED (x_i, y_j) and a_{ij} is the area covered by the detected image on the IS.


Table 1: Distance measurement data from an LED at different floor positions.

Horizontal distance dh (cm) | Measured direct distance (cm) | Actual direct distance (cm) | Error in distance measurement (cm) | Detected area (px) | 1st Radius r (px) | 2nd Radius r′ (px)
0   | 255.5  | 255.5  | 0    | 348 | 11 | 11
10  | 255.88 | 255.9  | 0.02 | 347 | 11 | 10
20  | 256.65 | 256.7  | 0.05 | 341 | 11 | 10
30  | 257.6  | 257.7  | 0.1  | 337 | 11 | 10
40  | 258.85 | 259    | 0.15 | 336 | 11 | 10
50  | 260    | 260.2  | 0.2  | 330 | 11 | 9
60  | 262.5  | 262.8  | 0.3  | 324 | 11 | 9
70  | 264.3  | 264.7  | 0.4  | 314 | 11 | 9
80  | 267.1  | 267.7  | 0.6  | 306 | 11 | 9
90  | 269.5  | 270.61 | 1.11 | 298 | 11 | 8
100 | 272.5  | 274.5  | 2    | 288 | 11 | 8

Normally, it is observed that F << D; therefore, we can combine the above equations as follows:

a_{ij} = m^2 A_{ij} \quad (8)

a_{ij} = \frac{F^2}{D^2} A_{ij} \quad (9)

D = F\sqrt{\frac{A_{ij}}{a_{ij}}} \quad (10)

where the focal length F and the actual LED size A_{ij} are known. The only requirement is to obtain the detected LED area a_{ij} from the IS.
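Equation (10) translates directly into code. In this sketch, converting the detected pixel area into the sensor's physical units is assumed to happen before the call:

```python
import math

def direct_distance(F: float, A_ij: float, a_ij: float) -> float:
    """Direct camera-to-LED distance from (10): D = F * sqrt(A_ij / a_ij).
    F: focal length; A_ij: actual LED surface area; a_ij: area of the
    detected image on the IS, all in consistent units."""
    if a_ij <= 0:
        raise ValueError("detected area must be positive")
    return F * math.sqrt(A_ij / a_ij)

# A smaller detected area implies a larger distance:
# direct_distance(2.0, 100.0, 4.0) -> 10.0
```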

Using (10), the distance can be measured accurately at a position at which the camera is vertically in the LOS of the LED. However, when the MR navigates through other places that are not directly below the LED, some errors can occur during this measurement, because the shape of the circular LED then appears to the camera as an ellipse. This happens owing to the radial distortion of the camera [44]. Wide-angle, short-focal-length, and fisheye cameras are the main sources of radial distortion. Determining the intrinsic, extrinsic, and distortion coefficients of the IS and the camera lens is required to remove the effect of radial distortion. This radial distortion significantly transforms the projected object relative to the actual object at a normal focal length. To verify the localization scheme, a back-projection method is deployed to measure the area of the LED from the distorted image in real-world space [45]. In the following, a figure illustrates the accuracy of the distance measurement. Although the accuracy remains greater than 98% even at the boundary position, we have undertaken one further step to mitigate this error. Let r be the radius of the circular LED; its area is then πr². When the circular shape changes to an ellipse, one radius r reduces to r′, and the area becomes πrr′. Table 1 presents the data measured from an LED with a size of 71 cm² located on a ceiling of height 2.56 m. The pixels allocated to the detected LED at different positions are recorded to show that the detected contour becomes smaller at farther distances. We tabulated the ellipse radius, which is required to calculate the actual ellipse area and correct the positioning error.

Figure 9: Percentage of accuracy of the measured direct distances from the camera to the LED at different horizontal points.

This radius data is integrated within the application and is looked up whenever the smartphone measures the radius through image processing. Comparing the stored data with the actual area of the LED allows the direct distance to be estimated.
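The ellipse correction described above can be sketched in a few lines; the function name is ours:

```python
import math

def ellipse_area(r: float, r_prime: float) -> float:
    """Area of the detected LED contour: pi * r * r'. Directly below the
    LED, r == r' (a circle); at oblique views r' shrinks, so using the raw
    circular area pi * r**2 in (10) would misstate the distance."""
    return math.pi * r * r_prime

# From Table 1: r stays 11 px while r' drops from 11 px (directly below
# the LED) to 8 px at a 100 cm horizontal offset.
corrected = ellipse_area(11, 8)
```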

The distance between the LED and the camera changes when the MR moves horizontally. The direct distance from a particular LED is required to calculate the horizontal distance that must be traversed to reach the floor location of that particular LED. We used (10) to measure the direct distance between the camera and the LED. Figure 9 graphically represents the percentage accuracy observed while measuring the direct distance at particular points on the floor. The distance is measured with 100% accuracy when the camera is situated directly below the LED. When the camera is shifted horizontally far away from the floor location of the LED, the accuracy is reduced.

Figure 10: Distance error comparison in terms of the variation of the number of LEDs inside the camera FOV at different horizontal points. (a) Inter-LED distance = 50 cm and (b) inter-LED distance = 100 cm.

To evaluate the performance of the proposed localization algorithm, we tested the accuracy of the distance measurement for different types of scenarios. We obtained a maximum distance measurement error of 2 cm in the case of a single LED inside the camera FOV. However, when there are multiple LEDs inside the camera FOV, the distance error increases because the probability of an LED being at the middle of the IS is reduced. Figure 10 compares the error in measuring the distance between the LEDs and the camera for different cases, varying the number of LEDs inside the camera FOV. We considered two scenarios: first, the case where the inter-LED distance is 50 cm and, second, the case where it is 100 cm. In each scenario, we considered 2 to 4 LEDs for the comparison. We started measuring from the floor position of an LED and took 10 more readings, moving away by about 10 cm in each step. The figure shows that if there are more LEDs in the IS at the same time, the error in measuring the distance increases, because the chance of an LED being detected at the center of the IS is correspondingly reduced. As our system comprises LEDs distributed in a rectangular pattern, it is most likely that 4 LEDs appear in the IS at a time. For this case, we have to accept up to 7.5 cm of distance error for positioning and navigation purposes when the inter-LED distance is 50 cm. For the second case, described in Figure 10(b), the distance error increases up to 8.5 cm for 4 LEDs owing to the increase of the inter-LED spacing to 100 cm.

Although the distance error reduces with a reduced number of LEDs allowed into the camera FOV, more than two LEDs are required in the camera FOV to obtain the navigation information, so this error must be accepted. We have also described a way in this section that can be applied to reduce the distance error to a certain level when moving a horizontal distance to reach the target LED location, increasing the positioning performance.

Figure 11: BER comparison for M-FSK (M = 2, 4, 8) in an indoor wireless Rician fading channel.

6.2. Bit Error Rate for MFSK. To evaluate the bit error rate (BER) performance, we considered an indoor wireless Rician fading channel. MFSK is simple and flexible for multilink communication over this channel. It is known [46] that the bit error probability for MFSK is

\rho_s = \frac{1}{M}\sum_{i=2}^{M} (-1)^i \binom{M}{i} \exp\left[-\left(1 - \frac{1}{i}\right)\rho\right] \quad (11)

where M is the modulation order and ρ is the instantaneous signal-to-noise ratio (SNR) value. In Figure 11, we depict a BER comparison graph between different modulation


Figure 12: Experimental setup. (a) Four LEDs, 50 cm apart from each other, transmit the location IDs (A = 71 cm²). (b) Captured image after setting the exposure time to 1/8000 sec.

orders (M = 2, 4, 8). We observed that the bit error probability decreases with increasing M. The transmission is performed keeping the same bit energy, which means that the symbol duration for 4-FSK doubles compared with 2-FSK while the same bit rate is maintained. In such a case, the BER becomes lower as the symbol duration increases with a higher modulation order.
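Equation (11) and the fixed-bit-energy comparison can be reproduced with a few lines. The log2(M) symbol-SNR scaling and the M/(2(M−1)) bit-mapping factor are standard conversions we assume here; they are not stated explicitly in the text:

```python
import math

def mfsk_symbol_error(M: int, snr: float) -> float:
    """Symbol error probability from (11), with snr the instantaneous
    linear SNR rho."""
    return sum((-1) ** i * math.comb(M, i) * math.exp(-(1.0 - 1.0 / i) * snr)
               for i in range(2, M + 1)) / M

def mfsk_bit_error(M: int, snr_per_bit: float) -> float:
    """BER at fixed energy per bit: the symbol SNR grows as log2(M) because
    each symbol carries log2(M) bits (assumed conversion)."""
    ps = mfsk_symbol_error(M, math.log2(M) * snr_per_bit)
    return ps * M / (2 * (M - 1))

# At the same SNR per bit, a higher order yields a lower BER,
# matching the trend in Figure 11.
```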

7. Demonstration Results

We used a platform to demonstrate our OCC system and evaluate its performance. To understand the ID distribution system, we should receive the location information using a minimum of two diagonally located LEDs, as described in Section 5. The platform comprises four LED transmitter links placed in a square shape, as depicted in Figure 12(a). The LEDs are arranged with a distance of 50 cm between them. For better explanation, we considered these transmitters to be a single unit. These four transmitters are linked continuously and transmit four different locations, which are encoded with the help of a PC and an LED driver circuit.

To focus only on the target-transmitting LED, we controlled the camera exposure time. The exposure time is the duration for which each pixel is exposed to light. Using this process, we can obtain the dark and white strips in the region in which the LEDs are captured, by darkening all parts of the image except those that depict the target LEDs. Figure 12(b) depicts the image of the transmitting LED unit captured using the Samsung S7 (edge) camera after setting the camera's CaptureRequest.SENSOR_EXPOSURE_TIME to 1/8000 sec. We can observe that all four LEDs exhibit the same dark and bright strip widths, because the same modulation frequencies, ranging from 2 to 4 kHz, were used for all four transmitters. Generally,

the characteristics of the rolling image are such that the strips are formed horizontally. However, in our case, we rotated the image using imageView at 270°, which is the reason for the observation of vertically formed strips.

To demodulate the data from the detected strip pattern, we should initially measure the width of each dark and white strip. Different cameras have different frame rates, and even for a fixed camera, the frame rate fluctuates within a short range. For this reason, the width also varies; as the width is measured as a number of pixels in a row, it must be an integer value. In our system, we observed brighter strips whose width varied between 5 and 7 pixels, whereas the width varied between 1 and 3 pixels for darker strips. Therefore, the midvalue of four was chosen to distinguish between dark and white patterns. When the measured width is greater than four, the smartphone considers the LED to be in the ON state and stores a binary '1' value; otherwise, the LED is assumed to be in the OFF state, and a '0' value is stored. Figure 13 illustrates the implemented android application software that can simultaneously receive data from four LEDs. We received five bits of ID per LED and mapped each combination of the received binary ID to a fixed location in the application system. The table in Figure 13 depicts the manner in which locations are assigned to different IDs. To receive multiple LEDs, the overall time for processing one frame should be within the critical frame time; otherwise, the application may unfortunately be stopped. The critical frame time is defined as the maximum processing time for one frame. In our system, we used 20 fps, which limits the critical frame time to 50 msec. We add a starting symbol to recognize when the data begins, so the time to obtain the data depends on the timing of the frame containing that symbol. We tested in real time by debugging the software in Android Studio, observing the time required for initialization, frame processing, demodulation, and device response. We found that almost 1000 ms is required to debug and obtain a device response when communicating with 4 LEDs simultaneously. As each LED transmits 10 bits, we obtained data at a 40 bps rate for 4 LEDs with this device.

Bits  | Coordinate
00000 | (1,1)
00001 | (1,2)
00010 | (2,1)
00011 | (2,2)
00100 | (1,3)

Figure 13: Implemented android application receiving multiple LED IDs simultaneously.
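The width-threshold demodulation described above reduces to one comparison per strip. The widths in the example mirror the 5-7 px bright and 1-3 px dark ranges reported above; the function name is ours:

```python
def demodulate_strips(widths, threshold=4):
    """Map measured strip widths (in pixels) to bits: a width above the
    midvalue threshold means the LED was ON ('1'), otherwise OFF ('0')."""
    return "".join("1" if w > threshold else "0" for w in widths)

# Five strips of widths 6, 2, 5, 1, 3 px decode to the bits "10100".
bits = demodulate_strips([6, 2, 5, 1, 3])
```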

8. Conclusion

This work focuses on the reception of data from multiple indoor LED transmitters as well as on the localization and navigation of the MR by processing the received data. The indoor LED locations are received as binary data of five bits per link at a data rate of 40 bps. The proposed algorithm was implemented in an indoor platform where four LEDs, each with a size of 71 cm², were kept at a distance of 50 cm from each other. The four LEDs transmit four different IDs, and the android application that was developed to receive these IDs performed well with all of them simultaneously. We were able to communicate with the LEDs from a maximum distance of 480 cm. As our positioning approach requires measurement of the direct distance from the camera to the LED, we compared the accuracy of the distance measurement at different floor positions. With the floor height constant, the percentage of error increases when the camera moves a long horizontal distance. To mitigate this error, we store some recorded distance values within the application for comparison. We found a maximum error of 2 cm at a 100 cm horizontal distance from the floor position of the target LED when the MR moves horizontally toward that LED. Thus, the accuracy is improved from the localization and navigation viewpoint as well as in terms of the distance for indoor communication.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest

Acknowledgments

This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the Global IT Talent support program (IITP-2017-0-01806) supervised by the IITP (Institute for Information and Communication Technology Promotion), and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (Performance Enhancement and Resource Management of Visual MIMO-based Camera Communication Networks) (No. NRF-2016R1D1A1A09919091).

References

[1] M. Z. Chowdhury, M. T. Hossan, A. Islam, and Y. M. Jang, "A comparative survey of optical wireless technologies: architectures and applications," IEEE Access, vol. 6, pp. 9819–9840, 2018.
[2] M. T. Hossan, M. Z. Chowdhury, A. Islam, and Y. M. Jang, "A novel indoor mobile localization system based on optical camera communication," Wireless Communications and Mobile Computing, vol. 2018, 17 pages, 2018.
[3] A. Khalajmehrabadi, N. Gatsis, and D. Akopian, "Modern WLAN fingerprinting indoor positioning methods and deployment challenges," IEEE Communications Surveys & Tutorials, vol. 19, no. 3, pp. 1974–2002, 2017.
[4] E. Kupershtein, M. Wax, and I. Cohen, "Single-site emitter localization via multipath fingerprinting," IEEE Transactions on Signal Processing, vol. 61, no. 1, pp. 10–21, 2013.
[5] P. Luo et al., "Experimental demonstration of RGB LED-based optical camera communications," IEEE Photonics Journal, vol. 7, no. 5, pp. 1–12, 2015.
[6] R. D. Roberts, "Undersampled frequency shift ON-OFF keying (UFSOOK) for camera communications (CamCom)," in Proceedings of the 22nd Wireless and Optical Communications Conference (WOCC 2013), pp. 645–648, China, May 2013.
[7] P. Luo, Z. Ghassemlooy, H. L. Minh, H. Tsai, and X. Tang, "Undersampled-PAM with subcarrier modulation for camera communications," in Proceedings of the 20th Opto-Electronics and Communications Conference (OECC), pp. 1–3, Shanghai, China, 2015.
[8] Y.-Y. Won, S.-H. Yang, D.-H. Kim, and S.-K. Han, "Three-dimensional optical wireless indoor positioning system using location code map based on power distribution of visible light emitting diode," IET Optoelectronics, vol. 7, no. 3, pp. 77–83, 2013.
[9] C. Yang and H.-R. Shao, "WiFi-based indoor positioning," IEEE Communications Magazine, vol. 53, no. 3, pp. 150–157, 2015.
[10] R. Xu, W. Chen, Y. Xu, and S. Ji, "A new indoor positioning system architecture using GPS signals," Sensors, vol. 15, no. 5, pp. 10074–10087, 2015.
[11] Z. Zhou, M. Kavehrad, and P. Deng, "Indoor positioning algorithm using light-emitting diode visible light communications," Optical Engineering, vol. 51, no. 8, 2012.
[12] W. Zhang, M. I. S. Chowdhury, and M. Kavehrad, "Asynchronous indoor positioning system based on visible light communications," Optical Engineering, vol. 53, no. 4, Article ID 045105, 2014.
[13] S.-Y. Jung, S. Hann, S. Park, and C.-S. Park, "Optical wireless indoor positioning system using light emitting diode ceiling lights," Microwave and Optical Technology Letters, vol. 54, no. 7, pp. 1622–1626, 2012.
[14] H.-S. Kim, D.-R. Kim, S.-H. Yang, Y.-H. Son, and S.-K. Han, "An indoor visible light communication positioning system using a RF carrier allocation technique," Journal of Lightwave Technology, vol. 31, no. 1, pp. 134–144, 2013.
[15] K. Panta and J. Armstrong, "Indoor localisation using white LEDs," IEEE Electronics Letters, vol. 48, no. 4, pp. 228–230, 2012.
[16] A. D. Cheok and L. Yue, "A novel light-sensor-based information transmission system for indoor positioning and navigation," IEEE Transactions on Instrumentation and Measurement, vol. 60, no. 1, pp. 290–299, 2011.
[17] M. Nakajima and S. Haruyama, "New indoor navigation system for visually impaired people using visible light communication," EURASIP Journal on Wireless Communications and Networking, vol. 2013, article 37, 2013.
[18] H. Huang, A. Yang, L. Feng, G. Ni, and P. Guo, "Indoor positioning method based on metameric white light sources and subpixels on a color image sensor," IEEE Photonics Journal, vol. 8, no. 6, 2016.
[19] M. S. Hossen, Y. Park, and K. Kim, "Performance improvement of indoor positioning using light-emitting diodes and an image sensor for light-emitting diode communication," Optical Engineering, vol. 54, no. 3, pp. 1–11, 2015.
[20] M.-G. Moon, S.-I. Choi, J. Park, and J. Y. Kim, "Indoor positioning system using LED lights and a dual image sensor," Journal of the Optical Society of Korea, vol. 19, no. 6, pp. 586–591, 2015.
[21] Y.-S. Kuo, P. Pannuto, K.-J. Hsiao, and P. Dutta, "Luxapose: indoor positioning with mobile phones and visible light," in Proceedings of the 20th ACM Annual International Conference on Mobile Computing and Networking (MobiCom 2014), pp. 447–458, USA, September 2014.
[22] P. Huynh and M. Yoo, "VLC-based positioning system for an indoor environment using an image sensor and an accelerometer sensor," Sensors, vol. 16, no. 6, 2016.
[23] H. Steendam, T. Q. Wang, and J. Armstrong, "Theoretical lower bound for indoor visible light positioning using received signal strength measurements and an aperture-based receiver," Journal of Lightwave Technology, vol. 35, no. 2, pp. 309–319, 2017.
[24] F. Yin, Y. Zhao, F. Gunnarsson, and F. Gustafsson, "Received-signal-strength threshold optimization using Gaussian processes," IEEE Transactions on Signal Processing, vol. 65, no. 8, pp. 2164–2177, 2017.
[25] X. Piao, Y. Zhang, T. Li et al., "RSS fingerprint based indoor localization using sparse representation with spatio-temporal constraint," Sensors, vol. 16, no. 11, 2016.
[26] C. Feng, W. S. A. Au, S. Valaee, and Z. Tan, "Received-signal-strength-based indoor positioning using compressive sensing," IEEE Transactions on Mobile Computing, vol. 11, no. 12, pp. 1983–1993, 2012.
[27] W. Gu, W. Zhang, M. Kavehrad, and L. Feng, "Three-dimensional light positioning algorithm with filtering techniques for indoor environments," Optical Engineering, vol. 53, no. 10, 2014.
[28] S.-H. Yang, E.-M. Jung, and S.-K. Han, "Indoor location estimation based on LED visible light communication using multiple optical receivers," IEEE Communications Letters, vol. 17, no. 9, pp. 1834–1837, 2013.
[29] F. I. K. Mousa, H. Le-Minh, Z. Ghassemlooy et al., "Indoor localization system utilizing two visible light emitting diodes," Optical Engineering, vol. 55, no. 11, 2016.
[30] K. Wang, A. Nirmalathas, C. Lim, K. Alameh, and E. Skafidas, "Optical wireless-based indoor localization system employing a single-channel imaging receiver," Journal of Lightwave Technology, vol. 34, no. 4, pp. 1141–1149, 2016.
[31] S.-H. Jung, B.-C. Moon, and D. Han, "Unsupervised learning for crowdsourced indoor localization in wireless networks," IEEE Transactions on Mobile Computing, vol. 15, no. 11, pp. 2892–2906, 2016.
[32] X. Wang, L. Gao, S. Mao, and S. Pandey, "CSI-based fingerprinting for indoor localization: a deep learning approach," IEEE Transactions on Vehicular Technology, vol. 66, no. 1, pp. 763–776, 2017.
[33] Y. Hu and G. Leus, "Robust differential received signal strength-based localization," IEEE Transactions on Signal Processing, vol. 65, no. 12, pp. 3261–3276, 2017.
[34] T.-H. Do and M. Yoo, "TDOA-based indoor positioning using visible light," Photonic Network Communications, vol. 27, no. 2, pp. 80–88, 2014.
[35] S.-Y. Jung, S. Hann, and C.-S. Park, "TDOA-based optical wireless indoor localization using LED ceiling lamps," IEEE Transactions on Consumer Electronics, vol. 57, no. 4, pp. 1592–1597, 2011.
[36] U. Nadeem, N. U. Hassan, M. A. Pasha, and C. Yuen, "Highly accurate 3D wireless indoor positioning system using white LED lights," IEEE Electronics Letters, vol. 50, no. 11, pp. 828–830, 2014.
[37] T. Q. Wang, Y. A. Sekercioglu, A. Neild, and J. Armstrong, "Position accuracy of time-of-arrival based ranging using visible light with application in indoor localization systems," Journal of Lightwave Technology, vol. 31, no. 20, pp. 3302–3308, 2013.
[38] O. Ennasr, G. Xing, and X. Tan, "Distributed time-difference-of-arrival (TDOA)-based localization of a moving target," in Proceedings of the 55th IEEE Conference on Decision and Control (CDC 2016), pp. 2652–2658, USA, December 2016.
[39] S. H. Yang, E. M. Jeong, and S. K. Han, "Indoor positioning based on received optical power difference by angle of arrival," IEEE Electronics Letters, vol. 50, no. 1, pp. 49–51, 2014.
[40] A. Arafa, X. Jin, and R. Klukas, "Wireless indoor optical positioning with a differential photosensor," IEEE Photonics Technology Letters, vol. 24, no. 12, pp. 1027–1029, 2012.
[41] M. T. Hossan, M. Z. Chowdhury, M. K. Hasan et al., "A new vehicle localization scheme based on combined optical camera communication and photogrammetry," Mobile Information Systems, vol. 2018, Article ID 8501898, 14 pages, 2018.
[42] S. M. Berman, D. S. Greenhouse, I. L. Bailey, R. D. Clear, and T. W. Raasch, "Human electroretinogram responses to video displays, fluorescent lighting, and other high frequency sources," Optometry and Vision Science, vol. 68, no. 8, pp. 645–662, 1991.
[43] H.-Y. Lee, H.-M. Lin, Y.-L. Wei, H.-I. Wu, H.-M. Tsai, and K. C.-J. Lin, "RollingLight: enabling line-of-sight light-to-camera communications," in Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services (MobiSys 2015), pp. 167–180, Italy, May 2015.
[44] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[45] H. A. Martins, J. R. Birk, and R. B. Kelley, "Camera models based on data from two calibration planes," Computer Graphics and Image Processing, vol. 17, no. 2, pp. 173–180, 1981.
[46] E. Arthurs and H. Dym, "On the optimum detection of digital signals in the presence of white Gaussian noise - a geometric interpretation and a study of three basic data transmission systems," IRE Transactions on Communications Systems, vol. 10, no. 4, pp. 336–372, 1962.


Page 6: An Implementation Approach and Performance …...Analysis of Image Sensor Based Multilateral Indoor Localization and Navigation System Md.Shahjalal ,Md.TanvirHossan ,Moh.KhalidHasan

6 Wireless Communications and Mobile Computing

Detect the LEDs withincamera FoV and define as N

Get LED co-ordinates

Measure horizontaldistance to that

LED

Pass anydistance

N ge 3

Compare co-ordinates with thetarget co-ordinate

Find the nearest LEDco-ordinate to gotarget co-ordinate

Pass thatdistance

Measure thehorizontal distance to

that LED

Are the distancesame

Set target co-ordinate to MR

Pass theshortestdistance

Navigationterminated

Target co-Ordinate found

Pass that distance

Measure thehorizontal distance

to both LED

Get LEDco-ordinates

Target co-ordinate found

Yes

No

Yes

No

Yes

Yes

No

No

Figure 5 The mobile robot navigation flowchart

obtaining the direct distance we can easily find the horizontaldistance s using Pythagoras distance formula for right angletriangle as the vertical distance is fixed inside a room Aftercalculating the value of s and understanding the directionto move MR navigates this distance and fixes its locationto (x y) location If the target location is not matched withany of the received location ID the MR estimates the nearestLED that is required to reach the destination Subsequentlythe MR traverses a certain amount of distance to go apoint that is exactly below that particular LED Furtherthe MR again receives the locations of the LEDs that arewithin camerarsquos FOV Subsequently the procedure continuesuntil the destination is reached If the camera detects twoLEDs within its FOV the MR cannot detect the locationdistribution systemTherefore it initially chooses one ID andgoes to a point that is located exactly below that LED andthen MR continues with the aforementioned procedure

52 Camera Field of View We considered the indoor envi-ronment to possess a uniform square distribution of LEDsFigure 7 depicts the manner in which the number of LEDswithin camerarsquos FOV depends on different factors such asthe height of the room the distance between two adjacentLEDs and the angle of view (AOV) ie the FOV angle Forthe purpose of navigation the MR should understand thelocation distribution system among the LEDs It is obvious

that at least two diagonally located LED locations should beconsidered to make a decision about the direction in whichthe X and Y locations are increasing or decreasing A relationhas been derived to estimate the number of LEDs that werecaptured within the FOV at any time considering the cameraview angle the height of the roof and the surface area takenby a single LED

The horizontal and vertical FOV can be estimated usingthe following two equations

120601ℎ = 2tanminus1 119889ℎ2119891 (1)

120601V = 2tanminus1 119889V2119891 (2)

where 120601V and 120601ℎ are the FOV in vertical and horizontaldirections and 119889ℎ 119889V are the sensor dimensions respectively119891 represents the focal length of the camera lens Figure 7shows the camera FOV and its area is projected vertically onthe roof of height ℎ119903 We can represent the area of the FOVusing the following equation

$$A_{FOV} = 4h_r^2 \tan(\phi_v)\tan(\phi_h) \quad (3)$$

Because the LEDs are distributed in a square pattern, with each LED separated by a distance $a$ along each side (where $a >$ LED diameter), each LED will occupy an area of $a^2$.

Wireless Communications and Mobile Computing 7

[Figure 6: Calculation procedure of the distance to move horizontally. The camera-mounted MR at its initial position receives the transmitted signals and measures the direct distance D to the LED at (x, y); with the fixed vertical height h, it moves the horizontal distance s to the final position (x, y).]

[Figure 7: Analysis of the camera's FOV. The camera at height $h_r$ below the ceiling projects an FOV footprint with side lengths $2h_r\tan(\phi_v)$ and $2h_r\tan(\phi_h)$.]

Now, we can find the number of LEDs that will be in the camera FOV using (3):

$$N_{LED} = \frac{4h_r^2 \tan(\phi_v)\tan(\phi_h)}{a^2} \quad (4)$$
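The LED-count estimate can be sketched numerically as below. This is an illustrative sketch, not the authors' implementation: the sensor dimensions and spacing are example values, and we apply the half-angle $\tan(\phi/2)$ so that the footprint side length matches the projection geometry of the full AOV defined in (1) and (2):

```python
import math

def fov_angle(sensor_dim_mm, focal_mm):
    # Full angle of view for one sensor axis: phi = 2*atan(d / 2f) -- eqs. (1), (2)
    return 2.0 * math.atan(sensor_dim_mm / (2.0 * focal_mm))

def leds_in_fov(h_r, spacing, sensor_w_mm, sensor_h_mm, focal_mm):
    """Estimated number of LEDs in the FOV for ceiling height h_r and
    inter-LED spacing a (same length units)."""
    phi_h = fov_angle(sensor_w_mm, focal_mm)
    phi_v = fov_angle(sensor_h_mm, focal_mm)
    # Footprint projected on the ceiling; tan(phi/2) because phi is the full angle
    area = 4.0 * h_r**2 * math.tan(phi_h / 2.0) * math.tan(phi_v / 2.0)
    return area / spacing**2  # each LED occupies a^2 -- eq. (4)
```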

5.3. Exploring the ID Distribution System. The smart phone used to perform this study was a Samsung S7 edge, which contained a 26-mm camera with a focal length of 4.2 mm. The height of the room that was observed using the camera was 2.56 m. We set up the android camera with a height and width of 600 and 800 px, respectively. In a general scenario, there were always at least four LEDs within the camera's FOV under this type of consideration. To define the coordinate distribution system, we divided

[Figure 8: Region-basis location selection and exploration of their distribution system, showing the LF, FR, RB, and BL regions of the image sensor.]

the IS into four regions, i.e., the left-front region (LFR), front-right region (FRR), right-back region (RBR), and back-left region (BLR), as depicted in Figure 8, where each region contains an equal number of pixels. The camera detects a single LED per region and receives its location ID as well. We defined the locations on a regional basis as (XLF, YLF), (XFR, YFR), (XRB, YRB), and (XBL, YBL), respectively, within the four regions. It is obvious that the LEDs will be present either in both the LF and RB regions or in both the FR and BL regions. If XLF < XRB or XFR > XBL, the x-location increases in the right direction; otherwise, it increases in the left direction. Further, if YLF > YRB or YFR > YBL, the y-location increases in the front direction; otherwise, it increases in the back direction.
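The direction inference above can be sketched as follows. This is a minimal sketch under the assumption that each region's decoded location ID is an (x, y) tuple, with `None` for a region holding no LED; the function name is ours:

```python
def axis_directions(lf, fr, rb, bl):
    """Infer the directions in which the x- and y-coordinates grow from
    two diagonally located LED IDs. LEDs appear either in the LF/RB pair
    or in the FR/BL pair (Section 5.3)."""
    if lf is not None and rb is not None:
        x_right = lf[0] < rb[0]   # XLF < XRB -> x grows to the right
        y_front = lf[1] > rb[1]   # YLF > YRB -> y grows to the front
    else:
        x_right = fr[0] > bl[0]   # XFR > XBL -> x grows to the right
        y_front = fr[1] > bl[1]   # YFR > YBL -> y grows to the front
    return ("right" if x_right else "left", "front" if y_front else "back")
```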

6. Performance Analysis

6.1. Measurement of Distance Accuracy. The direct distance from the camera to an LED can be measured by considering a few distance parameters. If $D$ is the direct distance of the LED $(x_i, y_j)$ from the camera lens and $d$ is the distance of the formed image on the sensor from the camera lens, then we can write, from the general lens equation,

$$\frac{1}{D} + \frac{1}{d} = \frac{1}{F} \quad (5)$$

$$\frac{d}{D} = \frac{F}{D - F} \quad (6)$$

where $F$ is the focal length of the lens. Here, a magnification factor is required to compare the actual size and the detected size on the IS.

The magnification factor $m$ can be written as follows:

$$m = \sqrt{\frac{a_{ij}}{A_{ij}}} = \frac{d}{D} \quad (7)$$

where $A_{ij}$ is the actual surface area of the nearest LED $(x_i, y_j)$ and $a_{ij}$ is the area covered by the detected image on the IS.


Table 1: Distance measurement data from an LED at different floor positions.

| Horizontal distance $d_h$ (cm) | Measured direct distance (cm) | Actual direct distance (cm) | Error in distance measurement (cm) | Detected area (px) | 1st radius $r$ (px) | 2nd radius $r'$ (px) |
|---|---|---|---|---|---|---|
| 0   | 255.5  | 255.5  | 0    | 348 | 11 | 11 |
| 10  | 255.88 | 255.9  | 0.02 | 347 | 11 | 10 |
| 20  | 256.65 | 256.7  | 0.05 | 341 | 11 | 10 |
| 30  | 257.6  | 257.7  | 0.1  | 337 | 11 | 10 |
| 40  | 258.85 | 259    | 0.15 | 336 | 11 | 10 |
| 50  | 260    | 260.2  | 0.2  | 330 | 11 | 9  |
| 60  | 262.5  | 262.8  | 0.3  | 324 | 11 | 9  |
| 70  | 264.3  | 264.7  | 0.4  | 314 | 11 | 9  |
| 80  | 267.1  | 267.7  | 0.6  | 306 | 11 | 9  |
| 90  | 269.5  | 270.61 | 1.11 | 298 | 11 | 8  |
| 100 | 272.5  | 274.5  | 2    | 288 | 11 | 8  |

Normally, it is observed that $F \ll D$; therefore, we can combine the above equations as follows:

$$a_{ij} = m^2 A_{ij} \quad (8)$$

$$a_{ij} = \frac{F^2}{D^2} A_{ij} \quad (9)$$

$$D = F\sqrt{\frac{A_{ij}}{a_{ij}}} \quad (10)$$

where the focal length $F$ and the actual LED size $A_{ij}$ are known. The only requirement is to obtain the detected LED area $a_{ij}$ from the IS.
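Equation (10) reduces to a one-line computation. The sketch below is illustrative only (the function name and example numbers are ours); it assumes the focal length and both areas are expressed in consistent physical units:

```python
import math

def direct_distance(F, A_ij, a_ij):
    """Eq. (10): D = F * sqrt(A_ij / a_ij).
    F: focal length; A_ij: actual LED area; a_ij: imaged LED area on the
    sensor, all in consistent units (e.g., mm and mm^2)."""
    return F * math.sqrt(A_ij / a_ij)

# Example: F = 4.2 mm, actual area 710 mm^2, imaged area 0.005 mm^2
D = direct_distance(4.2, 710.0, 0.005)
```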

Using (10), the distance can be measured accurately at a position at which the camera is vertically in the LOS of the LED. However, when the MR navigates through places that are not directly below the LED, some errors can occur during this measurement, because the shape of the circular LED then appears as an ellipse to the camera. This happens due to the radial distortion of the camera [44]. Wide-angle, short-focal-length, and fisheye cameras are the main sources of radial distortion. Determining the intrinsic, extrinsic, and distortion coefficients of the IS and the camera lens is required to remove the effect of radial distortion, which significantly transforms the projected object relative to the actual object even at a normal focal length. To verify the localization scheme, a back-projection method is deployed to measure the LED area from the distorted image in real-world space [45]. In the following section, we exhibit a figure to illustrate the accuracy of the distance measurement. Although the accuracy remains greater than 98% even at the boundary position, we have undertaken one further step to mitigate this error. Let $r$ be the radius of the circular LED, so that its area is $\pi r^2$. When the circular shape changes to an ellipse, one radius $r$ reduces to $r'$, and the area becomes $\pi r r'$. Table 1 presents the data measured from an LED with a size of 7.1 cm² located on a ceiling of height 2.56 m. The pixels allocated for the detected LED at different positions are

[Figure 9: Percentage of accuracy of the measured direct distances from the camera to the LED at different horizontal points (accuracy from 97.5% to 100.0% over horizontal distances of 0 to 100 cm).]

recorded to show that at far distances the detected contour becomes smaller than at nearer ones. We tabulated the radius of the ellipse, which is required to calculate the actual ellipse area and correct the positioning error.

These radius data are integrated within the application and are looked up when the radius is measured by the smartphone through image processing. Comparing the stored data with the actual area of the LED enables the direct distance to be estimated.
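The ellipse-area correction can be sketched as below. This is a minimal illustration, with the lookup dictionary populated from Table 1 (the first radius stays at 11 px in that setup); the names are ours, not the application's:

```python
import math

# 2nd radius r' (px) tabulated against horizontal distance (cm), per Table 1.
R_PRIME_TABLE = {0: 11, 10: 10, 20: 10, 30: 10, 40: 10, 50: 9,
                 60: 9, 70: 9, 80: 9, 90: 8, 100: 8}

def corrected_led_area_px(r_px, r_prime_px):
    """Off axis, the circular LED projects as an ellipse with semi-axes r
    and r', so its pixel area is pi*r*r' rather than pi*r^2."""
    return math.pi * r_px * r_prime_px
```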

The distance between the LED and the camera changes when the MR moves horizontally. The direct distance from a particular LED is required to calculate the horizontal distance that must be traveled and to obtain the floor location of that particular LED. We used (10) to measure the direct distance between the camera and the LED. Figure 9 graphically represents the percentage of accuracy observed while measuring the direct distance at particular points on the floor. The distance is measured with 100% accuracy when the camera is situated directly below the LED. When the


[Figure 10: Distance error comparison in terms of the variation in the number of LEDs (2, 3, or 4) inside the camera FOV at different horizontal points (0–100 cm): (a) inter-LED distance = 50 cm and (b) inter-LED distance = 100 cm.]

camera is shifted horizontally far away from the floor location of the LED, the accuracy is reduced.

To evaluate the performance of the proposed localization algorithm, we tested the accuracy of measuring the distance for different types of scenarios. We obtained a distance measurement error of at most 2 cm in the case of a single LED inside the camera FOV. However, when there are multiple LEDs inside the camera FOV, the distance error increases, as the probability of an LED being at the middle of the IS is reduced. Figure 10 compares the error in measuring the distance between the LEDs and the camera for different cases, varying the number of LEDs within the camera FOV. We considered two scenarios: in the first, the inter-LED distance is 50 cm, and in the second, it is 100 cm. In each scenario, we considered 2 to 4 LEDs for the comparison. We started measuring from the floor position of an LED and took 10 more readings, moving away by about 10 cm in each step. The figure shows that when more LEDs are in the IS at the same time, the error in measuring the distance increases, because the chance of an LED being detected at the center of the IS is reduced accordingly. As our system comprises LEDs distributed in a rectangular pattern, 4 LEDs are most likely to be in the IS at a time. For this case, we have to accept up to 7.5 cm of distance error for positioning and navigation purposes when the inter-LED distance is 50 cm. For the second case, described in Figure 10(b), the distance error increases up to 8.5 cm for 4 LEDs due to the increase of the inter-LED spacing to 100 cm.

Though the distance error is reduced with fewer LEDs within the camera FOV, more than two LEDs are required in the FOV to obtain the navigation information, so this error must be accepted. We have also described a way in this section that can be applied to reduce the distance error to a certain level when moving a horizontal distance to reach the target LED location, increasing the positioning performance.

[Figure 11: BER comparison for M-FSK (M = 2, 4, 8) in an indoor wireless Rician fading channel; BER from 10⁻⁵ to 10⁰ versus SNR from 1 to 11 dB.]

6.2. Bit Error Rate for M-FSK. To evaluate the bit error rate (BER) performance, we considered an indoor wireless Rician fading channel. M-FSK is simple and flexible for multilink communication over this channel. It is known [46] that the bit error probability for M-FSK is

$$\rho_s = \frac{1}{M}\sum_{i=2}^{M} (-1)^i \binom{M}{i} \exp\left[-\left(1 - \frac{1}{i}\right)\rho\right] \quad (11)$$

where $M$ is the modulation order and $\rho$ is the instantaneous signal-to-noise ratio (SNR) value. In Figure 11, we depict a BER comparison between different modulation


[Figure 12: Experimental setup: (a) four LEDs of area A = 7.1 cm², placed 50 cm apart from each other, transmit the location IDs; (b) captured image after setting the exposure time to 1/8000 sec.]

orders (M = 2, 4, 8). We observe that the bit error probability decreases with increasing M when the transmission keeps the same bit energy, which means that the symbol duration doubles for 4-FSK compared to 2-FSK while the same bit rate is maintained. In such a case, the BER becomes lower as the symbol duration increases with a higher modulation order.
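Equation (11) can be evaluated directly to reproduce curves like those in Figure 11. This is an illustrative sketch, not the authors' simulation code; note that $\rho$ in (11) is the linear SNR, so the figure's dB axis converts via $\rho = 10^{\mathrm{dB}/10}$:

```python
import math
from math import comb

def mfsk_ber(M, rho):
    """Bit error probability of M-FSK from eq. (11); rho is the
    instantaneous linear SNR (convert from dB with rho = 10**(dB/10))."""
    return (1.0 / M) * sum(
        (-1) ** i * comb(M, i) * math.exp(-(1.0 - 1.0 / i) * rho)
        for i in range(2, M + 1)
    )

# Example point for an 8-FSK curve at 6 dB SNR:
ber_8fsk = mfsk_ber(8, 10 ** (6 / 10))
```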

7. Demonstration Results

We used a platform to demonstrate our OCC system and evaluate its performance. To understand the ID distribution system, we should receive the location information using a minimum of two diagonally located LEDs, as described in Section 5. The platform comprises four LED transmitter links placed in a square shape, as depicted in Figure 12(a). The LEDs are arranged with a distance of 50 cm between them. For a better explanation, we considered these transmitters to be a single unit. These four transmitters are linked continuously and transmit four different locations, which are encoded with the help of a PC and an LED driver circuit.

To focus only on the target-transmitting LED, we controlled the camera exposure time. The exposure time is the duration for which each pixel is exposed to light. Using this process, we can obtain the dark and white strips in the region in which the LEDs are captured, by darkening all parts of the image except those that depict the target LEDs. Figure 12(b) depicts the image of the transmitting LED unit captured using the Samsung S7 (edge) camera after setting the camera's CaptureRequest.SENSOR_EXPOSURE_TIME to 1/8000 sec. We can observe that all four LEDs exhibit the same dark and bright strip widths. This is because the same modulation frequencies, ranging from 2 to 4 kHz, were used for all four transmitters. Generally,

the characteristics of the rolling-shutter image are such that the strips are formed horizontally. However, in our case, we rotated the image using imageView by 270°, which is the reason the strips appear vertically.

To demodulate the data from the detected strip pattern, we should initially measure the width of each dark and white strip. Different cameras have different frame rates, and even for a fixed camera, the frame rate fluctuates within a short range. For this reason, the width also varies, and as the width is measured as a number of pixels in a row, it must be an integer value. In our system, we observed brighter strips whose width varied between 5 and 7 pixels, whereas the width varied between 1 and 3 pixels for darker strips. Therefore, the midvalue of four was chosen to distinguish between dark and white patterns. When the measured width is greater than four, the smart phone considers the LED to be in the ON state and stores a binary '1' value; otherwise, the LED is assumed to be in the OFF state and a '0' value is stored. Figure 13 illustrates the android implementation software that can simultaneously receive data from four LEDs. We received five bits of ID per LED and mapped each combination of the received binary ID to a fixed location in the application system. The table in Figure 13 depicts the manner in which locations are assigned to different IDs. To receive multiple LEDs, the overall time for processing one frame should be within the critical frame time; otherwise, the application may stop unexpectedly. The critical frame time is defined as the maximum processing time for one frame. In our system, we used 20 fps to limit the critical frame time to 50 msec. We add a starting symbol to recognize when the data starts, so the time needed to obtain the data depends on when the frame containing that symbol is received. We tested in real time by debugging the software in Android Studio, observing the time required for initialization, frame processing, demodulation, and device response. We found that almost


[Figure 13: Implemented android application receiving multiple LED IDs simultaneously. The ID table maps bits to coordinates: 00000 → (1,1), 00001 → (1,2), 00010 → (2,1), 00011 → (2,2), 00100 → (1,3).]

1000 ms is required for debugging and obtaining the device response when communicating with 4 LEDs simultaneously. As each LED transmits 10 bits, we obtained data at a 40 bps rate for 4 LEDs with this device.
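The strip-width threshold rule described above can be sketched as follows; this is a minimal illustration (the function name is ours, and real processing would first extract the widths from the rolling-shutter image):

```python
def demodulate_strips(strip_widths, threshold=4):
    """Map measured strip widths (px) to bits: bright strips (5-7 px)
    decode as '1', dark strips (1-3 px) as '0', using the midvalue
    threshold of four described in Section 7."""
    return ''.join('1' if w > threshold else '0' for w in strip_widths)

# Example: widths measured along one pixel row of the captured frame
bits = demodulate_strips([6, 2, 5, 1, 7])
```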

8. Conclusion

This work focuses on the reception of data from multiple indoor LED transmitters as well as on the localization and navigation of the MR by processing the received data. The indoor LED locations are received as binary data of five bits per link at a data rate of 40 bps. The proposed algorithm was implemented in an indoor platform where four LEDs with sizes of 7.1 cm² were kept at a distance of 50 cm from each other. The four LEDs transmit four different IDs, and the android application developed to receive these IDs simultaneously performed well. We were able to communicate with the LEDs from a maximum distance of 480 cm. As our positioning approach requires measurement of the direct distance from the camera to the LED, we compared the accuracy of measuring the distance at different floor positions. Keeping the floor height constant, we observed an increase in the percentage of error when the camera moves a long horizontal distance. To mitigate this error, we store some recorded distance values within the application for comparison. We found a maximum 2 cm error at a 100 cm horizontal distance from the floor position of the target LED when the MR moves horizontally toward that LED. Thus, the accuracy is improved from the localization and navigation viewpoint as well as based on the distance for indoor communication.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest

Acknowledgments

This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the Global IT Talent support program (IITP-2017-0-01806) supervised by the IITP (Institute for Information and Communication Technology Promotion), and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (Performance Enhancement and Resource Management of Visual MIMO-based Camera Communication Networks) (No. NRF-2016R1D1A1A09919091).

References

[1] M. Z. Chowdhury, M. T. Hossan, A. Islam, and Y. M. Jang, "A Comparative Survey of Optical Wireless Technologies: Architectures and Applications," IEEE Access, vol. 6, pp. 9819–9840, 2018.

[2] M. T. Hossan, M. Z. Chowdhury, A. Islam, and Y. M. Jang, "A Novel Indoor Mobile Localization System Based on Optical Camera Communication," Wireless Communications and Mobile Computing, vol. 2018, 17 pages, 2018.

[3] A. Khalajmehrabadi, N. Gatsis, and D. Akopian, "Modern WLAN Fingerprinting Indoor Positioning: Methods and Deployment Challenges," IEEE Communications Surveys & Tutorials, vol. 19, no. 3, pp. 1974–2002, 2017.

[4] E. Kupershtein, M. Wax, and I. Cohen, "Single-site emitter localization via multipath fingerprinting," IEEE Transactions on Signal Processing, vol. 61, no. 1, pp. 10–21, 2013.

[5] P. Luo et al., "Experimental demonstration of RGB LED-based optical camera communications," IEEE Photonics Journal, vol. 7, no. 5, pp. 1–12, 2015.

[6] R. D. Roberts, "Undersampled frequency shift ON-OFF keying (UFSOOK) for camera communications (CamCom)," in Proceedings of the 22nd Wireless and Optical Communications Conference, WOCC 2013, pp. 645–648, China, May 2013.

[7] P. Luo, Z. Ghassemlooy, H. L. Minh, H. Tsai, and X. Tang, "Undersampled-PAM with subcarrier modulation for camera communications," in Proceedings of the 20th Opto-Electronics and Communications Conference (OECC), pp. 1–3, Shanghai, China, 2015.

[8] Y.-Y. Won, S.-H. Yang, D.-H. Kim, and S.-K. Han, "Three-dimensional optical wireless indoor positioning system using location code map based on power distribution of visible light emitting diode," IET Optoelectronics, vol. 7, no. 3, pp. 77–83, 2013.

[9] C. Yang and H.-R. Shao, "WiFi-based indoor positioning," IEEE Communications Magazine, vol. 53, no. 3, pp. 150–157, 2015.

[10] R. Xu, W. Chen, Y. Xu, and S. Ji, "A New Indoor Positioning System Architecture Using GPS Signals," Sensors, vol. 15, no. 5, pp. 10074–10087, 2015.

[11] Z. Zhou, M. Kavehrad, and P. Deng, "Indoor positioning algorithm using light-emitting diode visible light communications," Optical Engineering, vol. 51, no. 8, 2012.

[12] W. Zhang, M. I. S. Chowdhury, and M. Kavehrad, "Asynchronous indoor positioning system based on visible light communications," Optical Engineering, vol. 53, no. 4, Article ID 045105, 2014.

[13] S.-Y. Jung, S. Hann, S. Park, and C.-S. Park, "Optical wireless indoor positioning system using light emitting diode ceiling lights," Microwave and Optical Technology Letters, vol. 54, no. 7, pp. 1622–1626, 2012.

[14] H.-S. Kim, D.-R. Kim, S.-H. Yang, Y.-H. Son, and S.-K. Han, "An indoor visible light communication positioning system using a RF carrier allocation technique," Journal of Lightwave Technology, vol. 31, no. 1, pp. 134–144, 2013.

[15] K. Panta and J. Armstrong, "Indoor localisation using white LEDs," IEEE Electronics Letters, vol. 48, no. 4, pp. 228–230, 2012.

[16] A. D. Cheok and L. Yue, "A novel light-sensor-based information transmission system for indoor positioning and navigation," IEEE Transactions on Instrumentation and Measurement, vol. 60, no. 1, pp. 290–299, 2011.

[17] M. Nakajima and S. Haruyama, "New indoor navigation system for visually impaired people using visible light communication," EURASIP Journal on Wireless Communications and Networking, vol. 2013, article 37, 2013.

[18] H. Huang, A. Yang, L. Feng, G. Ni, and P. Guo, "Indoor Positioning Method Based on Metameric White Light Sources and Subpixels on a Color Image Sensor," IEEE Photonics Journal, vol. 8, no. 6, 2016.

[19] M. S. Hossen, Y. Park, and K. Kim, "Performance improvement of indoor positioning using light-emitting diodes and an image sensor for light-emitting diode communication," Optical Engineering, vol. 54, no. 3, pp. 1–11, 2015.

[20] M.-G. Moon, S.-I. Choi, J. Park, and J. Y. Kim, "Indoor positioning system using LED lights and a dual image sensor," Journal of the Optical Society of Korea, vol. 19, no. 6, pp. 586–591, 2015.

[21] Y.-S. Kuo, P. Pannuto, K.-J. Hsiao, and P. Dutta, "Luxapose: Indoor positioning with mobile phones and visible light," in Proceedings of the 20th ACM Annual International Conference on Mobile Computing and Networking, MobiCom 2014, pp. 447–458, USA, September 2014.

[22] P. Huynh and M. Yoo, "VLC-based positioning system for an indoor environment using an image sensor and an accelerometer sensor," Sensors, vol. 16, no. 6, 2016.

[23] H. Steendam, T. Q. Wang, and J. Armstrong, "Theoretical lower bound for indoor visible light positioning using received signal strength measurements and an aperture-based receiver," Journal of Lightwave Technology, vol. 35, no. 2, pp. 309–319, 2017.

[24] F. Yin, Y. Zhao, F. Gunnarsson, and F. Gustafsson, "Received-signal-strength threshold optimization using Gaussian processes," IEEE Transactions on Signal Processing, vol. 65, no. 8, pp. 2164–2177, 2017.

[25] X. Piao, Y. Zhang, T. Li et al., "RSS fingerprint based indoor localization using sparse representation with spatio-temporal constraint," Sensors, vol. 16, no. 11, 2016.

[26] C. Feng, W. S. A. Au, S. Valaee, and Z. Tan, "Received-signal-strength-based indoor positioning using compressive sensing," IEEE Transactions on Mobile Computing, vol. 11, no. 12, pp. 1983–1993, 2012.

[27] W. Gu, W. Zhang, M. Kavehrad, and L. Feng, "Three-dimensional light positioning algorithm with filtering techniques for indoor environments," Optical Engineering, vol. 53, no. 10, 2014.

[28] S.-H. Yang, E.-M. Jung, and S.-K. Han, "Indoor location estimation based on LED visible light communication using multiple optical receivers," IEEE Communications Letters, vol. 17, no. 9, pp. 1834–1837, 2013.

[29] F. I. K. Mousa, H. Le-Minh, Z. Ghassemlooy et al., "Indoor localization system utilizing two visible light emitting diodes," Optical Engineering, vol. 55, no. 11, 2016.

[30] K. Wang, A. Nirmalathas, C. Lim, K. Alameh, and E. Skafidas, "Optical wireless-based indoor localization system employing a single-channel imaging receiver," Journal of Lightwave Technology, vol. 34, no. 4, pp. 1141–1149, 2016.

[31] S.-H. Jung, B.-C. Moon, and D. Han, "Unsupervised learning for crowdsourced indoor localization in wireless networks," IEEE Transactions on Mobile Computing, vol. 15, no. 11, pp. 2892–2906, 2016.

[32] X. Wang, L. Gao, S. Mao, and S. Pandey, "CSI-Based Fingerprinting for Indoor Localization: A Deep Learning Approach," IEEE Transactions on Vehicular Technology, vol. 66, no. 1, pp. 763–776, 2017.

[33] Y. Hu and G. Leus, "Robust differential received signal strength-based localization," IEEE Transactions on Signal Processing, vol. 65, no. 12, pp. 3261–3276, 2017.

[34] T.-H. Do and M. Yoo, "TDOA-based indoor positioning using visible light," Photonic Network Communications, vol. 27, no. 2, pp. 80–88, 2014.

[35] S.-Y. Jung, S. Hann, and C.-S. Park, "TDOA-based optical wireless indoor localization using LED ceiling lamps," IEEE Transactions on Consumer Electronics, vol. 57, no. 4, pp. 1592–1597, 2011.

[36] U. Nadeem, N. U. Hassan, M. A. Pasha, and C. Yuen, "Highly accurate 3D wireless indoor positioning system using white LED lights," IEEE Electronics Letters, vol. 50, no. 11, pp. 828–830, 2014.

[37] T. Q. Wang, Y. A. Sekercioglu, A. Neild, and J. Armstrong, "Position accuracy of time-of-arrival based ranging using visible light with application in indoor localization systems," Journal of Lightwave Technology, vol. 31, no. 20, pp. 3302–3308, 2013.

[38] O. Ennasr, G. Xing, and X. Tan, "Distributed time-difference-of-arrival (TDOA)-based localization of a moving target," in Proceedings of the 55th IEEE Conference on Decision and Control, CDC 2016, pp. 2652–2658, USA, December 2016.

[39] S. H. Yang, E. M. Jeong, and S. K. Han, "Indoor positioning based on received optical power difference by angle of arrival," IEEE Electronics Letters, vol. 50, no. 1, pp. 49–51, 2014.

[40] A. Arafa, X. Jin, and R. Klukas, "Wireless indoor optical positioning with a differential photosensor," IEEE Photonics Technology Letters, vol. 24, no. 12, pp. 1027–1029, 2012.

[41] M. T. Hossan, M. Z. Chowdhury, M. K. Hasan et al., "A New Vehicle Localization Scheme Based on Combined Optical Camera Communication and Photogrammetry," Mobile Information Systems, vol. 2018, Article ID 8501898, 14 pages, 2018.

[42] S. M. Berman, D. S. Greenhouse, I. L. Bailey, R. D. Clear, and T. W. Raasch, "Human electroretinogram responses to video displays, fluorescent lighting, and other high frequency sources," Optometry and Vision Science, vol. 68, no. 8, pp. 645–662, 1991.

[43] H.-Y. Lee, H.-M. Lin, Y.-L. Wei, H.-I. Wu, H.-M. Tsai, and K. C.-J. Lin, "RollingLight: Enabling line-of-sight light-to-camera communications," in Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services, MobiSys 2015, pp. 167–180, Italy, May 2015.

[44] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.

[45] H. A. Martins, J. R. Birk, and R. B. Kelley, "Camera models based on data from two calibration planes," Computer Graphics and Image Processing, vol. 17, no. 2, pp. 173–180, 1981.

[46] E. Arthurs and H. Dym, "On the Optimum Detection of Digital Signals in the Presence of White Gaussian Noise—A Geometric Interpretation and a Study of Three Basic Data Transmission Systems," IRE Transactions on Communications Systems, vol. 10, no. 4, pp. 336–372, 1962.


Page 7: An Implementation Approach and Performance …...Analysis of Image Sensor Based Multilateral Indoor Localization and Navigation System Md.Shahjalal ,Md.TanvirHossan ,Moh.KhalidHasan

Wireless Communications and Mobile Computing 7

TransmittedSignals

Cameramounted MR

Final position (xy)

D

s

h

sInitial position

LED(xy)

Figure 6 Calculation procedure of the distance to move horizon-tally

FOV

Camera

ℎr

2ℎr

tan(

)

2ℎr tan(

ℎ )

Figure 7 Analysis of the camerarsquos FOV

Now we can find the number of LEDs will be in thecamera FOV using (3)

119873119871119864119863 = 4ℎ1199032 tan (120601V) tan (120601ℎ)1198862 (4)

53 Exploring the ID Distribution System The smart phonethat was used to perform this study was a Samsung S7 edgewhich contained a 26-mm camera and a focal length of42 mm The height of the room that was observed usingthe camera was 256 m We set up the android camerawith a height and width of 600 and 800 px respectivelyIn a general scenario there were always at least four LEDswithin the camerarsquos FOV under this type of considerationTo define the coordination distribution system we divided

LF region FR region

RB regionBL region

(89) (8292)

(89) (8292)

Figure 8 Region basis locations selection and exploring theirdistribution system

the IS into four regions ie left-front region (LFR) front-right region (FRR) right-back region (RBR) and back-leftregion (BLR) as depicted in Figure 8 where each regioncontains an equal number of pixels The camera will detect asingle LED per region and receive the location ID as well Wedefined the locations on a regional basis which are (XLFYLF)(XFRYFR) (XRBYRB) and (XBLYBL) respectively within thefour regions It is obvious that the LEDs will be present eitherin both LF and RB regions or in both FR and BL regions IfXLF lt XRB or XFR gt XBL the x-location increases in theright direction otherwise it increases in the left directionFurther if YLF gt YRB or YFR gt YBL the y-location increasesin the front direction otherwise it increases in the back direc-tion

6 Performance Analysis

61 Measurement of Distance Accuracy The direct distancefrom the camera to an LED can be measured by consideringa few distance parameters If D is the direct distance of theLED (119909119894 119910119895) from the camera lens and d is the distance of theformed image on the sensor from the camera lens then wecan write from the general lens equation

1119863 + 1119889 = 1119865 (5)

119889119863 = 119865119863 minus 119865 (6)

where F is the focal length of the lens Here magnificationfactor is required to introduce and compare the relative sizeof the actual size and the detected size on the IS

The magnification factor m can be written as follows

119898 = radic 119886119894119895119860 119894119895 =119889119863 (7)

where119860 119894119895 is the actual surface area of the nearest LED (119909119894 119910119895)and the area covered by the detected image on the IS is 119886119894119895

8 Wireless Communications and Mobile Computing

Table 1 Distance measurement data from an LED at different floor positions

Horizontaldistance dh(cm)

Measureddirect distance

(cm)

Actual directdistance (cm)

Error in distancemeasurement

(cm)

Detected area(px)

1st Radius119903 (px)

2nd Radius1199031015840 (px)

00 2555 2555 0 348

11

1110 25588 2559 002 347 1020 25665 2567 005 341 1030 2576 2577 01 337 1040 25885 259 015 336 1050 260 2602 02 330 960 2625 2628 03 324 970 2643 2647 04 314 980 2671 2677 06 306 990 2695 27061 111 298 8100 2725 2745 2 288 8

Normally it is observed that FltltD therefore we cancombine the above equation as follows

119886119894119895 = 1198982119860 119894119895 (8)

119886119894119895 = 11986521198632119860 119894119895 (9)

119863 = 119865radic119860 119894119895119886119894119895 (10)

where the focal length F and the actual LED size 119860 119894119895 areknown The only requirement is to obtain the detected LEDarea 119886119894119895 from the IS

Using (10), the distance can be measured accurately at a position at which the camera is vertically in the LOS of the LED. But when the MR navigates through places that are not directly below the LED, some errors can be experienced during this measurement, because the shape of the circular LED then appears to the camera as an ellipse. This happens due to the radial distortion of the camera [44]. Wide-angle, short-focal-length, and fisheye cameras are the main sources of radial distortion. Determining the intrinsic, extrinsic, and distortion coefficients of the IS and the camera lens is required to remove the effect of radial distortion, which significantly transforms the projected object relative to the actual object even at a normal focal length. To verify the localization scheme, a back-projection method is deployed to measure the area of the LED from the distorted image of real-world space [45]. Figure 9 illustrates the accuracy of the distance measurement. Although the accuracy remains greater than 98% even at the boundary position, we have undertaken one further step to mitigate this error. Let $r$ be the radius of the circular LED, so that its area is $\pi r^{2}$. When the circular shape changes to an ellipse, one radius $r$ reduces to $r'$, and the area becomes $\pi r r'$. Table 1 presents the data measured from an LED with a size of 71 cm², located on a ceiling at a height of 2.56 m. The pixels allocated to the detected LED at different positions are recorded, showing that the detected contour becomes smaller at far distances than at near ones. We also tabulated the radius of the ellipse, which is required to calculate the actual area of the ellipse and correct the positioning error.

Figure 9: Percentage of accuracy of the measured direct distances from the camera to the LED at different horizontal points.

These radius data are integrated within the application and are looked up when the smartphone measures the radius through image processing. By comparing the stored data with the actual area of the LED, the direct distance can be estimated.
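A rough illustration of this correction follows. The constant first radius and the function names are our assumptions (modeled on Table 1, where the first radius is 11 px at the zero-offset position), not the paper's code: the stored second radius $r'$ restores the true ellipse area $\pi r r'$, which can then be fed back into (10).

```python
import math

# Assumption from Table 1: the first (unshortened) radius of the
# projected LED is about 11 px; only the second radius r' shrinks
# noticeably as the camera moves away from the LED's floor position.
R_FIRST_PX = 11

def corrected_area_px(r_second_px):
    """Projected ellipse area pi * r * r' in px^2, used as the a_ij input of Eq. (10)."""
    return math.pi * R_FIRST_PX * r_second_px
```

For example, `corrected_area_px(10)` gives about 346 px², close to the 347 px detected at a 10 cm horizontal offset in Table 1.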

The distance between the LED and the camera changes when the MR moves horizontally. The direct distance from a particular LED is required to calculate the horizontal distance that the MR must travel to reach the floor location of that LED. We used (10) to measure the direct distance between the camera and the LED. Figure 9 plots the percentage of accuracy observed while measuring the direct distance at particular points on the floor. The distance is measured with 100% accuracy when the camera is situated directly below the LED. When the camera is shifted horizontally far away from the floor location of the LED, the accuracy is reduced.

Figure 10: Comparison of distance error for different numbers of LEDs inside the camera FOV at different horizontal points. (a) Inter-LED distance = 50 cm; (b) inter-LED distance = 100 cm.
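The horizontal offset itself follows from the measured direct distance and the fixed camera-to-ceiling height by Pythagoras; a minimal sketch (names are ours):

```python
import math

def horizontal_distance_cm(direct_cm, height_cm):
    """Horizontal offset to the LED's floor position.

    direct_cm: direct (line-of-sight) camera-to-LED distance from Eq. (10).
    height_cm: fixed vertical distance from the camera plane to the LED.
    The max(..., 0.0) guards against small negative values caused by
    measurement noise when the camera is almost directly below the LED.
    """
    return math.sqrt(max(direct_cm ** 2 - height_cm ** 2, 0.0))
```

With Table 1's far point (direct distance 274.5 cm, vertical distance 255.5 cm) this yields roughly 100 cm, matching the tabulated horizontal offset.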

To evaluate the performance of the proposed localization algorithm, we tested the accuracy of the distance measurement in different types of scenarios. We obtained a maximum distance-measurement error of 2 cm in the case of a single LED inside the camera FOV. However, when there are multiple LEDs inside the camera FOV, the distance error increases, as the probability of an LED being at the middle of the IS is reduced. Figure 10 compares the error in measuring the distance between the LEDs and the camera for different cases, varying the number of LEDs inside the camera FOV. We considered two scenarios: in the first, the inter-LED distance is 50 cm; in the second, it is 100 cm. In each scenario, we considered 2 to 4 LEDs for the comparison. We started measuring from the floor position of an LED and took 10 more readings, moving away by about 10 cm in each step. The figure shows that if more LEDs are inside the IS at the same time, the distance-measurement error increases, because the chance of an LED being detected at the center of the IS is correspondingly reduced. As our system comprises LEDs distributed in a rectangular pattern, it is most likely that 4 LEDs are in the IS at a time. For this case, we have to accept up to 7.5 cm of distance error for positioning and navigation purposes when the inter-LED distance is 50 cm. For the second case, described in Figure 10(b), the distance error increases up to 8.5 cm for 4 LEDs, owing to the increase of the inter-LED spacing to 100 cm.

Although the distance error reduces with fewer LEDs allowed into the camera FOV, more than two LEDs are required within the FOV to obtain the navigation information, so this error must be accepted. We have also described a way in this section to reduce the distance error to a certain level when moving a horizontal distance toward the target LED location, thereby increasing the positioning performance.

Figure 11: BER comparison for M-FSK (M = 2, 4, 8) in an indoor wireless Rician fading channel.

6.2. Bit Error Rate for M-FSK. To evaluate the bit error rate (BER) performance, we considered an indoor wireless Rician fading channel. M-FSK is simple and flexible for multilink communication over this channel. It is known [46] that the bit error probability for M-FSK is

$$P_{s} = \frac{1}{M} \sum_{i=2}^{M} (-1)^{i} \binom{M}{i} \exp\left[-\left(1 - \frac{1}{i}\right)\rho\right] \qquad (11)$$
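Equation (11) is straightforward to evaluate numerically. The sketch below (function and argument names are ours) computes the alternating sum term by term:

```python
from math import comb, exp

def mfsk_error_prob(M, rho):
    """Eq. (11): (1/M) * sum_{i=2..M} (-1)^i * C(M, i) * exp(-(1 - 1/i) * rho).

    M: modulation order; rho: instantaneous SNR (linear, not dB).
    """
    return sum((-1) ** i * comb(M, i) * exp(-(1.0 - 1.0 / i) * rho)
               for i in range(2, M + 1)) / M
```

As a sanity check, for M = 2 the sum collapses to the single i = 2 term, giving the familiar 0.5·exp(−ρ/2) of noncoherent binary FSK.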

where $M$ is the modulation order and $\rho$ is the instantaneous signal-to-noise ratio (SNR). Figure 11 depicts a BER comparison between different modulation orders (M = 2, 4, 8). We observe that the bit error probability decreases with increasing M when the transmission keeps the same bit energy, which means that the symbol duration for 4-FSK becomes double that of 2-FSK while the same bit rate is maintained. In such a case, the BER becomes lower because the symbol duration increases with a higher modulation order.

Figure 12: Experimental setup. (a) Four LEDs, 50 cm apart from each other, transmit the location IDs; each LED has an area A = 71 cm². (b) Captured image after setting the exposure time to 1/8000 sec.

7. Demonstration Results

We used an experimental platform to demonstrate our OCC system and evaluate its performance. According to the ID distribution system described in Section 5, the location information should be received using a minimum of two diagonally located LEDs. The platform comprises four LED transmitter links placed in a square shape, as depicted in Figure 12(a). The LEDs are arranged with a distance of 50 cm between them. For better explanation, we considered these transmitters to be a single unit. These four transmitters run continuously and transmit four different location IDs, which are encoded with the help of a PC and an LED driver circuit.

To focus only on the target transmitting LED, we controlled the camera exposure time, i.e., the duration for which each pixel is exposed to light. Using this process, we can obtain the dark and white strips in the region where the LEDs are captured, by darkening all parts of the image except those that depict the target LEDs. Figure 12(b) depicts the image of the transmitting LED unit captured using the Samsung S7 (edge) camera after setting the camera's CaptureRequest SENSOR_EXPOSURE_TIME to 1/8000 sec. We can observe that all four LEDs exhibit the same dark and bright strip widths. This is because the same modulation frequencies, ranging from 2 to 4 kHz, were used for all four transmitters. Generally,

the characteristics of the rolling-shutter image are such that the strips are formed horizontally. However, in our case we rotated the image using imageView by 270°, which is why the strips appear vertically.

To demodulate the data from the detected strip pattern, we should initially measure the width of each dark and white strip. Different cameras have different frame rates, and even for a fixed camera the frame rate fluctuates within a short range. For this reason the width also varies, and as the width is measured as a number of pixels in a row, it should be an integer value. In our system, we observed brighter strips whose width varied between 5 and 7 pixels, whereas the width varied between 1 and 3 pixels for darker strips. Therefore, the mid-value of four was chosen to distinguish between dark and white patterns. When the measured width is greater than four, the smartphone considers the LED to be in the ON state and stores a binary '1'; otherwise, the LED is assumed to be in the OFF state and a '0' is stored. Figure 13 illustrates the implemented Android software, which can simultaneously receive data from four LEDs. We received five bits of ID per LED and mapped each combination of the received binary ID to a fixed location in the application. The table in Figure 13 depicts the manner in which locations are assigned to different IDs. To receive multiple LEDs, the overall time for processing one frame should be within the critical frame time; otherwise, the application may stop unexpectedly. The critical frame time is defined as the maximum processing time for one frame. In our system, we used 20 fps, which limits the critical frame time to 50 msec. We add a starting symbol to recognize when the data begins, so the time to obtain the data depends on when the frame containing that symbol arrives. We tested the software in real time by debugging in Android Studio and observed the time required for initialization, frame processing, demodulation, and device response. We found that almost
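The threshold rule described above can be sketched as follows (function and variable names are ours; the widths are per-strip pixel counts measured from the rolling-shutter image):

```python
# Mid-value between dark strips (1-3 px) and bright strips (5-7 px),
# as reported for this system.
THRESHOLD_PX = 4

def demodulate_strips(strip_widths):
    """Map measured strip widths to a bit string.

    A strip wider than the threshold is a bright (LED ON) strip -> '1';
    otherwise it is a dark (LED OFF) strip -> '0'.
    """
    return ''.join('1' if width > THRESHOLD_PX else '0' for width in strip_widths)
```

For example, `demodulate_strips([6, 2, 5, 7, 1])` yields `'10110'`.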


Coordinate | Bits
(1,1) | 00000
(1,2) | 00001
(2,1) | 00010
(2,2) | 00011
(1,3) | 00100

Figure 13: Implemented Android application receiving multiple LED IDs simultaneously.
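The ID table of Figure 13 amounts to a simple lookup from a received 5-bit string to a grid coordinate. A sketch (the dictionary and function names are ours; it holds only the combinations listed in the figure):

```python
# ID-to-location table as listed in Figure 13.
ID_TO_COORDINATE = {
    '00000': (1, 1),
    '00001': (1, 2),
    '00010': (2, 1),
    '00011': (2, 2),
    '00100': (1, 3),
}

def locate(received_bits):
    """Return the grid cell assigned to a demodulated 5-bit ID, or None if unassigned."""
    return ID_TO_COORDINATE.get(received_bits)
```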

1000 ms is required from debugging to obtaining the device response when communicating with 4 LEDs simultaneously. As each LED transmits 10 bits, we obtained data at a 40 bps rate for 4 LEDs with this device.

8. Conclusion

This work focuses on the reception of data from multiple indoor LED transmitters as well as on the localization and navigation of the MR by processing the received data. The indoor LED locations are received as binary data of five bits per link at a data rate of 40 bps. The proposed algorithm was implemented on an indoor platform where four LEDs, each with a size of 71 cm², were kept at a distance of 50 cm from each other. The four LEDs transmit four different IDs, and the Android application developed to receive these IDs performed well with all of them simultaneously. We were able to communicate with the LEDs from a maximum distance of 480 cm. As our positioning approach requires measurement of the direct distance from the camera to the LED, we compared the accuracy of the distance measurement at different floor positions. With the floor height kept constant, the percentage of error increases when the camera moves a long distance horizontally. To mitigate this error, we store some recorded distance values within the application for comparison. We found a maximum error of 2 cm at a horizontal distance of 100 cm from the floor position of the target LED when the MR moves horizontally toward that LED. Thus, the accuracy is improved from the localization and navigation viewpoint as well as in terms of the distance for indoor communication.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the Global IT Talent support program (IITP-2017-0-01806) supervised by the IITP (Institute for Information and Communication Technology Promotion), and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (Performance Enhancement and Resource Management of Visual MIMO-based Camera Communication Networks) (No. NRF-2016R1D1A1A09919091).

References

[1] M. Z. Chowdhury, M. T. Hossan, A. Islam, and Y. M. Jang, "A comparative survey of optical wireless technologies: architectures and applications," IEEE Access, vol. 6, pp. 9819–9840, 2018.

[2] M. T. Hossan, M. Z. Chowdhury, A. Islam, and Y. M. Jang, "A novel indoor mobile localization system based on optical camera communication," Wireless Communications and Mobile Computing, vol. 2018, 17 pages, 2018.

[3] A. Khalajmehrabadi, N. Gatsis, and D. Akopian, "Modern WLAN fingerprinting indoor positioning methods and deployment challenges," IEEE Communications Surveys & Tutorials, vol. 19, no. 3, pp. 1974–2002, 2017.

[4] E. Kupershtein, M. Wax, and I. Cohen, "Single-site emitter localization via multipath fingerprinting," IEEE Transactions on Signal Processing, vol. 61, no. 1, pp. 10–21, 2013.

[5] P. Luo et al., "Experimental demonstration of RGB LED-based optical camera communications," IEEE Photonics Journal, vol. 7, no. 5, pp. 1–12, 2015.

[6] R. D. Roberts, "Undersampled frequency shift ON-OFF keying (UFSOOK) for camera communications (CamCom)," in Proceedings of the 22nd Wireless and Optical Communications Conference (WOCC 2013), pp. 645–648, China, May 2013.

[7] P. Luo, Z. Ghassemlooy, H. L. Minh, H. Tsai, and X. Tang, "Undersampled-PAM with subcarrier modulation for camera communications," in Proceedings of the 20th Opto-Electronics and Communications Conference (OECC), pp. 1–3, Shanghai, China, 2015.

[8] Y.-Y. Won, S.-H. Yang, D.-H. Kim, and S.-K. Han, "Three-dimensional optical wireless indoor positioning system using location code map based on power distribution of visible light emitting diode," IET Optoelectronics, vol. 7, no. 3, pp. 77–83, 2013.

[9] C. Yang and H.-R. Shao, "WiFi-based indoor positioning," IEEE Communications Magazine, vol. 53, no. 3, pp. 150–157, 2015.

[10] R. Xu, W. Chen, Y. Xu, and S. Ji, "A new indoor positioning system architecture using GPS signals," Sensors, vol. 15, no. 5, pp. 10074–10087, 2015.

[11] Z. Zhou, M. Kavehrad, and P. Deng, "Indoor positioning algorithm using light-emitting diode visible light communications," Optical Engineering, vol. 51, no. 8, 2012.

[12] W. Zhang, M. I. S. Chowdhury, and M. Kavehrad, "Asynchronous indoor positioning system based on visible light communications," Optical Engineering, vol. 53, no. 4, Article ID 045105, 2014.

[13] S.-Y. Jung, S. Hann, S. Park, and C.-S. Park, "Optical wireless indoor positioning system using light emitting diode ceiling lights," Microwave and Optical Technology Letters, vol. 54, no. 7, pp. 1622–1626, 2012.

[14] H.-S. Kim, D.-R. Kim, S.-H. Yang, Y.-H. Son, and S.-K. Han, "An indoor visible light communication positioning system using an RF carrier allocation technique," Journal of Lightwave Technology, vol. 31, no. 1, pp. 134–144, 2013.

[15] K. Panta and J. Armstrong, "Indoor localisation using white LEDs," IEEE Electronics Letters, vol. 48, no. 4, pp. 228–230, 2012.

[16] A. D. Cheok and L. Yue, "A novel light-sensor-based information transmission system for indoor positioning and navigation," IEEE Transactions on Instrumentation and Measurement, vol. 60, no. 1, pp. 290–299, 2011.

[17] M. Nakajima and S. Haruyama, "New indoor navigation system for visually impaired people using visible light communication," EURASIP Journal on Wireless Communications and Networking, vol. 2013, article 37, 2013.

[18] H. Huang, A. Yang, L. Feng, G. Ni, and P. Guo, "Indoor positioning method based on metameric white light sources and subpixels on a color image sensor," IEEE Photonics Journal, vol. 8, no. 6, 2016.

[19] M. S. Hossen, Y. Park, and K. Kim, "Performance improvement of indoor positioning using light-emitting diodes and an image sensor for light-emitting diode communication," Optical Engineering, vol. 54, no. 3, pp. 1–11, 2015.

[20] M.-G. Moon, S.-I. Choi, J. Park, and J. Y. Kim, "Indoor positioning system using LED lights and a dual image sensor," Journal of the Optical Society of Korea, vol. 19, no. 6, pp. 586–591, 2015.

[21] Y.-S. Kuo, P. Pannuto, K.-J. Hsiao, and P. Dutta, "Luxapose: indoor positioning with mobile phones and visible light," in Proceedings of the 20th ACM Annual International Conference on Mobile Computing and Networking (MobiCom 2014), pp. 447–458, USA, September 2014.

[22] P. Huynh and M. Yoo, "VLC-based positioning system for an indoor environment using an image sensor and an accelerometer sensor," Sensors, vol. 16, no. 6, 2016.

[23] H. Steendam, T. Q. Wang, and J. Armstrong, "Theoretical lower bound for indoor visible light positioning using received signal strength measurements and an aperture-based receiver," Journal of Lightwave Technology, vol. 35, no. 2, pp. 309–319, 2017.

[24] F. Yin, Y. Zhao, F. Gunnarsson, and F. Gustafsson, "Received-signal-strength threshold optimization using Gaussian processes," IEEE Transactions on Signal Processing, vol. 65, no. 8, pp. 2164–2177, 2017.

[25] X. Piao, Y. Zhang, T. Li et al., "RSS fingerprint based indoor localization using sparse representation with spatio-temporal constraint," Sensors, vol. 16, no. 11, 2016.

[26] C. Feng, W. S. A. Au, S. Valaee, and Z. Tan, "Received-signal-strength-based indoor positioning using compressive sensing," IEEE Transactions on Mobile Computing, vol. 11, no. 12, pp. 1983–1993, 2012.

[27] W. Gu, W. Zhang, M. Kavehrad, and L. Feng, "Three-dimensional light positioning algorithm with filtering techniques for indoor environments," Optical Engineering, vol. 53, no. 10, 2014.

[28] S.-H. Yang, E.-M. Jung, and S.-K. Han, "Indoor location estimation based on LED visible light communication using multiple optical receivers," IEEE Communications Letters, vol. 17, no. 9, pp. 1834–1837, 2013.

[29] F. I. K. Mousa, H. Le-Minh, Z. Ghassemlooy et al., "Indoor localization system utilizing two visible light emitting diodes," Optical Engineering, vol. 55, no. 11, 2016.

[30] K. Wang, A. Nirmalathas, C. Lim, K. Alameh, and E. Skafidas, "Optical wireless-based indoor localization system employing a single-channel imaging receiver," Journal of Lightwave Technology, vol. 34, no. 4, pp. 1141–1149, 2016.

[31] S.-H. Jung, B.-C. Moon, and D. Han, "Unsupervised learning for crowdsourced indoor localization in wireless networks," IEEE Transactions on Mobile Computing, vol. 15, no. 11, pp. 2892–2906, 2016.

[32] X. Wang, L. Gao, S. Mao, and S. Pandey, "CSI-based fingerprinting for indoor localization: a deep learning approach," IEEE Transactions on Vehicular Technology, vol. 66, no. 1, pp. 763–776, 2017.

[33] Y. Hu and G. Leus, "Robust differential received signal strength-based localization," IEEE Transactions on Signal Processing, vol. 65, no. 12, pp. 3261–3276, 2017.

[34] T.-H. Do and M. Yoo, "TDOA-based indoor positioning using visible light," Photonic Network Communications, vol. 27, no. 2, pp. 80–88, 2014.

[35] S.-Y. Jung, S. Hann, and C.-S. Park, "TDOA-based optical wireless indoor localization using LED ceiling lamps," IEEE Transactions on Consumer Electronics, vol. 57, no. 4, pp. 1592–1597, 2011.

[36] U. Nadeem, N. U. Hassan, M. A. Pasha, and C. Yuen, "Highly accurate 3D wireless indoor positioning system using white LED lights," IEEE Electronics Letters, vol. 50, no. 11, pp. 828–830, 2014.

[37] T. Q. Wang, Y. A. Sekercioglu, A. Neild, and J. Armstrong, "Position accuracy of time-of-arrival based ranging using visible light with application in indoor localization systems," Journal of Lightwave Technology, vol. 31, no. 20, pp. 3302–3308, 2013.

[38] O. Ennasr, G. Xing, and X. Tan, "Distributed time-difference-of-arrival (TDOA)-based localization of a moving target," in Proceedings of the 55th IEEE Conference on Decision and Control (CDC 2016), pp. 2652–2658, USA, December 2016.

[39] S. H. Yang, E. M. Jeong, and S. K. Han, "Indoor positioning based on received optical power difference by angle of arrival," IEEE Electronics Letters, vol. 50, no. 1, pp. 49–51, 2014.

[40] A. Arafa, X. Jin, and R. Klukas, "Wireless indoor optical positioning with a differential photosensor," IEEE Photonics Technology Letters, vol. 24, no. 12, pp. 1027–1029, 2012.

[41] M. T. Hossan, M. Z. Chowdhury, M. K. Hasan et al., "A new vehicle localization scheme based on combined optical camera communication and photogrammetry," Mobile Information Systems, vol. 2018, Article ID 8501898, 14 pages, 2018.

[42] S. M. Berman, D. S. Greenhouse, I. L. Bailey, R. D. Clear, and T. W. Raasch, "Human electroretinogram responses to video displays, fluorescent lighting, and other high frequency sources," Optometry and Vision Science, vol. 68, no. 8, pp. 645–662, 1991.

[43] H.-Y. Lee, H.-M. Lin, Y.-L. Wei, H.-I. Wu, H.-M. Tsai, and K. C.-J. Lin, "RollingLight: enabling line-of-sight light-to-camera communications," in Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services (MobiSys 2015), pp. 167–180, Italy, May 2015.

[44] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.

[45] H. A. Martins, J. R. Birk, and R. B. Kelley, "Camera models based on data from two calibration planes," Computer Graphics and Image Processing, vol. 17, no. 2, pp. 173–180, 1981.

[46] E. Arthurs and H. Dym, "On the optimum detection of digital signals in the presence of white Gaussian noise - a geometric interpretation and a study of three basic data transmission systems," IRE Transactions on Communications Systems, vol. 10, no. 4, pp. 336–372, 1962.


Page 8: An Implementation Approach and Performance …...Analysis of Image Sensor Based Multilateral Indoor Localization and Navigation System Md.Shahjalal ,Md.TanvirHossan ,Moh.KhalidHasan

8 Wireless Communications and Mobile Computing

Table 1 Distance measurement data from an LED at different floor positions

Horizontaldistance dh(cm)

Measureddirect distance

(cm)

Actual directdistance (cm)

Error in distancemeasurement

(cm)

Detected area(px)

1st Radius119903 (px)

2nd Radius1199031015840 (px)

00 2555 2555 0 348

11

1110 25588 2559 002 347 1020 25665 2567 005 341 1030 2576 2577 01 337 1040 25885 259 015 336 1050 260 2602 02 330 960 2625 2628 03 324 970 2643 2647 04 314 980 2671 2677 06 306 990 2695 27061 111 298 8100 2725 2745 2 288 8

Normally it is observed that FltltD therefore we cancombine the above equation as follows

119886119894119895 = 1198982119860 119894119895 (8)

119886119894119895 = 11986521198632119860 119894119895 (9)

119863 = 119865radic119860 119894119895119886119894119895 (10)

where the focal length F and the actual LED size 119860 119894119895 areknown The only requirement is to obtain the detected LEDarea 119886119894119895 from the IS

Using (10) the distance can be measured accurately ata position at which the camera is vertically in the LOS ofthe LED But when it navigates through some other placeswhich are not just below the LED can be experienced someerrors during this measurement Because at that time theshape of the circular LED will be seemed like a ellipse tothe camera This happens due to the radial distortion ofthe camera [44] The wide-angle camera short focal-lengthscamera or the fisheye camera are the main reason behindthe radial distortion Determining intrinsic extrinsic anddistortion coefficient of IS and lens of the camera is requiredto omit the effect of radial distortion This radial distortionmakes a significant transformation of the projected objectfrom the actual object with a normal focal length To verifythe localization scheme a back-projection method deploysto measure area of LED from distorted image from read-world space [45] In the following section we will exhibit afigure to illustrate the accuracy of the distance measurementAlthough at the boundary position the accuracy remainsgreater than 98 we have undertaken one further step tomitigate this error Let us take 119903 as the radius of circular LEDand the area will be 1205871199032 Now when circular shape changesto eclipse one radius 119903 reduces to 1199031015840 and the area becomes1205871199031199031015840 Table 1 presents the data measured from a LED witha size of 71 cm2 located at a ceiling of height of 256 m Thepixels allocated for the detected LED at different positions are

975

980

985

990

995

1000

Perc

enta

ge o

f acc

urac

y

10 20 30 40 50 60 70 80 90 1000Horizontal distance (cm)

Figure 9 Percentage of accuracy of the measured direct distancesfrom the camera to the LED at different horizontal points

recorded to show at far distance detected contour becomessmaller than nearer We tabulated the radius of eclipse whichis required to calculate the actual size of eclipse and correctthe positioning error

This radius data is integrated within the application andwill be called when this radius will be measured by smart-phone through image processing Comparing the existingdata with the actual area of the LED can predict to measurethe direct distance

The distance between the LED and the camera changeswhen the MR moves horizontally The direct distance from aparticular LED is required to calculate the horizontal distancethat is required to move and to get the floor location ofthat particular LED We used (10) to measure the directdistance between the camera and the LED We graphicallyrepresented the percentage of accuracy that is observed whilemeasuring the direct distance at a particular point on the floorin Figure 9 The distance is measured with 100 accuracywhen the camera is situated directly below the LEDWhen the

Wireless Communications and Mobile Computing 9

2 LED in camera FOV3 LED in camera FOV4 LED in camera FOV

20 40 60 80 1000Horizontal distance (cm)

0

1

2

3

4

5

6

7

8

9

10

Dist

ance

erro

r (cm

)

(a)

2 LED in camera FOV3 LED in camera FOV4 LED in camera FOV

0

1

2

3

4

5

6

7

8

9

10

Dist

ance

erro

r (cm

)

20 40 60 80 1000Horizontal distance (cm)

(b)

Figure 10 Distance error comparison in terms of variation of no of LEDs inside the camera FOV at different horizontal points (a) Inter-LEDdistance = 50 cm and (b) inter-LED distance = 100 cm

camera is shifted horizontally far away from the floor locationof the LED the accuracy is reduced

To evaluate the performance of the proposed localizationalgorithm we have tested the accuracy of measuring thedistance for different type of scenarios We have got thedistance measurement error of 2 cm maximum in case ofsingle LED inside the camera FOV However when thereare multiple LEDs inside the camera FOV the distance errorincreases as the probability of LED being at the middle ofthe IS reduces Figure 10 depicts the comparison of errorin measuring the distance between LEDs and camera fordifferent cases varying the number of LEDs into the cameraFOV We have taken two scenarios first is for the case wherethe inter-LED distance is 50 cm and second is for the casewhere it is 100 cm In each scenario we have considered 2to 4 LEDs for the comparison We started measuring fromthe floor position of an LED and took 10 more reading goingaway about 10 cm in each step We can see from the figurethat if there are more LEDs into the IS at the same time theerror formeasuring the distance increasesThis is because thechance of an LED to be detected at the center of IS is beingreduced accordingly As our system comprises LEDs that aredistributed in a rectangular pattern there is more possibilityof being 4 LEDs at a time in the IS And for this case we haveto consider up to 75 cm distance error for positioning andnavigation purpose when the inter-LED distance is 50 cmBut for the second case described in Figure 10(b) the valueof the distance error increases up to 85 cm for 4 LEDs due toincrease of the inter-LED spacing to 100 cm

Though the distance error reducing with the reducednumber of LEDs being allowed into the camera FOV it isrequired to have more than two LEDs into the camera FOVto get the navigation information and we should accept thiserror anyway Also we have described a way in this sectionthat can be applied to reduce the distance error to a certainlevel for moving to a horizontal distance to reach the targetLED location increasing the positioning performance

2-FSK4-FSK8-FSK

2 3 4 5 6 7 8 9 10 111Signal to noise ratio (dB)

10minus5

10minus4

10minus3

10minus2

10minus1

100

Bit-e

rror

rate

Figure 11 BER comparison forM-FSK (M=248) in indoor wirelessRician fading channel

62 Bit Error Rate for MFSK To evaluate bit error rate(BER) performance we considered an indoor wireless Ricianfading channel MFSK is simple and flexible for multilinkcommunication over this channel It is known [46] that thebit error probability for MFSK is

120588119904 = 1119872119872sum119894=2

(minus1)119894 (119872119894 ) exp [minus (1 minus1119894 ) 120588] (11)

where M is the number of orders and 120588is the instantaneoussignal-to-noise ratio (SNR) value In Figure 11 we depicteda BER comparison graph between different modulation

10 Wireless Communications and Mobile Computing

50 cm

50 cm

50 cm

50 cm

$ =71 cG2

(a) (b)

Figure 12 Experimental setup (a) Four LEDs that are 50 cm apart from each other transmit the location IDs (b) Captured image after settingthe exposure time to 18000 sec

orders (M=248) We have seen for increasing M bit errorprobability decreasesWhen the transmission is done keepingthe same bit energy which means that the symbol durationbecomes double for the case of 4-FSK compared to 2-FSKmaintaining the same bit rate meanwhile In such case theBER becomes lower as symbol duration increases with highermodulation index

7 Demonstration Results

We used a platform to demonstrate our OCC system toevaluate its performance To understand the ID distributionsystem we should receive the location information using aminimum of two diagonally located LEDs that are describedin Section 5 The platform comprises four LED transmit-ter links that are placed in a square shape as depictedin Figure 12(a) The LEDs are arranged by maintaining adistance of 50 cm between them For better explanationwe considered these transmitters to be a single unit Thesefour transmitters are linked continuously and transmit fourdifferent locations which are encoded with the help of a PCand an LED driver circuit

To focus only on the target-transmitting LED we con-trolled the camera exposure time The exposure time is theduration at which each pixel is exposed to light Using thisprocess we can obtain the dark and white strips at a regionat which the LEDs are captured by darkening all the partsof the image except the parts that depict the target LEDsFigure 12(b) depicts the captured image using the SamsungS7 (edge) camera of the transmitting LED unit after settingthe camerarsquos CaptureRequestSENSORE EXPOSURE TIMEto 18000 secWe can observe that all the four LEDs representthe same dark and bright strip width This is because samemodulation frequencies were used for all the four transmit-ters and the frequencies ranged from 2 to 4 kHz Generally

the characteristics of the rolling image are such that the stripscan be formed horizontally However in our case we rotatedthe image using imageView at 270∘ and this is the reason forthe observation of vertically formed strips

Wireless Communications and Mobile Computing 11

To demodulate the data from the detected strip pattern, we first measure the width of each dark and white strip. Different cameras have different frame rates, and even for a fixed camera the frame rate fluctuates within a short range. The strip width therefore also varies; because it is measured as a count of pixels in a row, it is always an integer. In our system the width of the brighter strips varied between 5 and 7 pixels, whereas that of the darker strips varied between 1 and 3 pixels. A mid-value of four was therefore chosen to distinguish between dark and white patterns: when the measured width is greater than four, the smartphone considers the LED to be in the ON state and stores a binary '1'; otherwise, the LED is assumed to be in the OFF state and a '0' is stored. Figure 13 illustrates the implemented Android application, which can receive data from four LEDs simultaneously. We received a five-bit ID per LED and mapped each combination of the received binary ID to a fixed location in the application. The table in Figure 13 shows how locations are assigned to the different IDs. To receive multiple LEDs, the overall time for processing one frame must stay within the critical frame time, defined as the maximum processing time for one frame; otherwise, the application may be stopped. In our system we used 20 fps, which limits the critical frame time to 50 ms. We add a starting symbol to mark where the data begins, so the time needed to obtain the data depends on when the frame containing that symbol is captured. Debugging the software in real time in Android Studio, we observed the time required for initialization, frame processing, demodulation, and device response: almost 1000 ms is needed to obtain the device response when communicating with 4 LEDs simultaneously. As each LED transmits 10 bits, we obtained data at a rate of 40 bps for 4 LEDs with this device.

Coordinate   Bits
(1,1)        00000
(1,2)        00001
(2,1)        00010
(2,2)        00011
(1,3)        00100

Figure 13: Implemented Android application receiving multiple LEDs' IDs simultaneously.
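The thresholding rule, the ID-to-location lookup, and the timing figures described above can be sketched as follows. This is a minimal illustration, not the authors' Android code: the 4-pixel threshold, the 5-bit IDs, the Figure 13 coordinate table, and the 20 fps / 10-bit / ~1 s numbers come from the text, while the helper names are ours.

```python
# 5-bit LED IDs mapped to fixed grid coordinates as listed in Figure 13
# (we read "(11)" in the figure's table as grid cell (1, 1), and so on)
ID_TO_COORD = {
    '00000': (1, 1),
    '00001': (1, 2),
    '00010': (2, 1),
    '00011': (2, 2),
    '00100': (1, 3),
}

def strips_to_id(strip_widths, threshold=4):
    """Strip width > threshold pixels -> bright strip (LED ON) -> '1'; else '0'."""
    return ''.join('1' if w > threshold else '0' for w in strip_widths)

# Example widths: dark strips are 1-3 px wide, bright strips 5-7 px
led_id = strips_to_id([2, 3, 1, 6, 7])
print(led_id, ID_TO_COORD[led_id])  # 00011 (2, 2)

# Timing sanity check: 20 fps -> 50 ms critical frame time;
# 4 LEDs x 10 bits over ~1 s of processing -> 40 bps aggregate
fps, leds, bits_per_led, response_s = 20, 4, 10, 1.0
print(1000 / fps, leds * bits_per_led / response_s)  # 50.0 40.0
```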

8. Conclusion

This work focuses on the reception of data from multiple indoor LED transmitters as well as on the localization and navigation of the MR by processing the received data. The indoor LED locations are received as binary data of five bits per link at a data rate of 40 bps. The proposed algorithm was implemented on an indoor platform where four LEDs, each with an area of 71 cm², were placed 50 cm from each other. The four LEDs transmit four different IDs, and the Android application developed to receive these IDs performed well with all of them simultaneously. We were able to communicate with the LEDs from a maximum distance of 480 cm. As our positioning approach requires measuring the direct distance from the camera to the LED, we compared the accuracy of the distance measurement at different floor positions. With the floor height held constant, the percentage error increases as the camera moves a long distance horizontally. To mitigate this error, we fixed some recorded distance values within the application for comparison. We found a maximum error of 2 cm at a horizontal distance of 100 cm from the floor position of the target LED when the MR moves horizontally toward that LED. Thus, the accuracy is improved from the localization and navigation viewpoint as well as in terms of the distance for indoor communication.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest

Acknowledgments

This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the Global IT Talent support program (IITP-2017-0-01806) supervised by the IITP (Institute for Information and Communication Technology Promotion), and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (Performance Enhancement and Resource Management of Visual MIMO-based Camera Communication Networks) (No. NRF-2016R1D1A1A09919091).

References

[1] M. Z. Chowdhury, M. T. Hossan, A. Islam, and Y. M. Jang, "A Comparative Survey of Optical Wireless Technologies: Architectures and Applications," IEEE Access, vol. 6, pp. 9819–9840, 2018.

[2] M. T. Hossan, M. Z. Chowdhury, A. Islam, and Y. M. Jang, "A Novel Indoor Mobile Localization System Based on Optical Camera Communication," Wireless Communications and Mobile Computing, vol. 2018, 17 pages, 2018.

[3] A. Khalajmehrabadi, N. Gatsis, and D. Akopian, "Modern WLAN Fingerprinting Indoor Positioning: Methods and Deployment Challenges," IEEE Communications Surveys & Tutorials, vol. 19, no. 3, pp. 1974–2002, 2017.

[4] E. Kupershtein, M. Wax, and I. Cohen, "Single-site emitter localization via multipath fingerprinting," IEEE Transactions on Signal Processing, vol. 61, no. 1, pp. 10–21, 2013.

[5] P. Luo et al., "Experimental demonstration of RGB LED-based optical camera communications," IEEE Photonics Journal, vol. 7, no. 5, pp. 1–12, 2015.

[6] R. D. Roberts, "Undersampled frequency shift ON-OFF keying (UFSOOK) for camera communications (CamCom)," in Proceedings of the 22nd Wireless and Optical Communications Conference (WOCC 2013), pp. 645–648, China, May 2013.

[7] P. Luo, Z. Ghassemlooy, H. L. Minh, H. Tsai, and X. Tang, "Undersampled-PAM with subcarrier modulation for camera communications," in Proceedings of the 20th Opto-Electronics and Communications Conference (OECC), pp. 1–3, Shanghai, China, 2015.

[8] Y.-Y. Won, S.-H. Yang, D.-H. Kim, and S.-K. Han, "Three-dimensional optical wireless indoor positioning system using location code map based on power distribution of visible light emitting diode," IET Optoelectronics, vol. 7, no. 3, pp. 77–83, 2013.

[9] C. Yang and H.-R. Shao, "WiFi-based indoor positioning," IEEE Communications Magazine, vol. 53, no. 3, pp. 150–157, 2015.


[10] R. Xu, W. Chen, Y. Xu, and S. Ji, "A New Indoor Positioning System Architecture Using GPS Signals," Sensors, vol. 15, no. 5, pp. 10074–10087, 2015.

[11] Z. Zhou, M. Kavehrad, and P. Deng, "Indoor positioning algorithm using light-emitting diode visible light communications," Optical Engineering, vol. 51, no. 8, 2012.

[12] W. Zhang, M. I. S. Chowdhury, and M. Kavehrad, "Asynchronous indoor positioning system based on visible light communications," Optical Engineering, vol. 53, no. 4, Article ID 045105, 2014.

[13] S.-Y. Jung, S. Hann, S. Park, and C.-S. Park, "Optical wireless indoor positioning system using light emitting diode ceiling lights," Microwave and Optical Technology Letters, vol. 54, no. 7, pp. 1622–1626, 2012.

[14] H.-S. Kim, D.-R. Kim, S.-H. Yang, Y.-H. Son, and S.-K. Han, "An indoor visible light communication positioning system using an RF carrier allocation technique," Journal of Lightwave Technology, vol. 31, no. 1, pp. 134–144, 2013.

[15] K. Panta and J. Armstrong, "Indoor localisation using white LEDs," IEEE Electronics Letters, vol. 48, no. 4, pp. 228–230, 2012.

[16] A. D. Cheok and L. Yue, "A novel light-sensor-based information transmission system for indoor positioning and navigation," IEEE Transactions on Instrumentation and Measurement, vol. 60, no. 1, pp. 290–299, 2011.

[17] M. Nakajima and S. Haruyama, "New indoor navigation system for visually impaired people using visible light communication," EURASIP Journal on Wireless Communications and Networking, vol. 2013, article 37, 2013.

[18] H. Huang, A. Yang, L. Feng, G. Ni, and P. Guo, "Indoor Positioning Method Based on Metameric White Light Sources and Subpixels on a Color Image Sensor," IEEE Photonics Journal, vol. 8, no. 6, 2016.

[19] M. S. Hossen, Y. Park, and K. Kim, "Performance improvement of indoor positioning using light-emitting diodes and an image sensor for light-emitting diode communication," Optical Engineering, vol. 54, no. 3, pp. 1–11, 2015.

[20] M.-G. Moon, S.-I. Choi, J. Park, and J. Y. Kim, "Indoor positioning system using LED lights and a dual image sensor," Journal of the Optical Society of Korea, vol. 19, no. 6, pp. 586–591, 2015.

[21] Y.-S. Kuo, P. Pannuto, K.-J. Hsiao, and P. Dutta, "Luxapose: Indoor positioning with mobile phones and visible light," in Proceedings of the 20th ACM Annual International Conference on Mobile Computing and Networking (MobiCom 2014), pp. 447–458, USA, September 2014.

[22] P. Huynh and M. Yoo, "VLC-based positioning system for an indoor environment using an image sensor and an accelerometer sensor," Sensors, vol. 16, no. 6, 2016.

[23] H. Steendam, T. Q. Wang, and J. Armstrong, "Theoretical lower bound for indoor visible light positioning using received signal strength measurements and an aperture-based receiver," Journal of Lightwave Technology, vol. 35, no. 2, pp. 309–319, 2017.

[24] F. Yin, Y. Zhao, F. Gunnarsson, and F. Gustafsson, "Received-signal-strength threshold optimization using Gaussian processes," IEEE Transactions on Signal Processing, vol. 65, no. 8, pp. 2164–2177, 2017.

[25] X. Piao, Y. Zhang, T. Li et al., "RSS fingerprint based indoor localization using sparse representation with spatio-temporal constraint," Sensors, vol. 16, no. 11, 2016.

[26] C. Feng, W. S. A. Au, S. Valaee, and Z. Tan, "Received-signal-strength-based indoor positioning using compressive sensing," IEEE Transactions on Mobile Computing, vol. 11, no. 12, pp. 1983–1993, 2012.

[27] W. Gu, W. Zhang, M. Kavehrad, and L. Feng, "Three-dimensional light positioning algorithm with filtering techniques for indoor environments," Optical Engineering, vol. 53, no. 10, 2014.

[28] S.-H. Yang, E.-M. Jung, and S.-K. Han, "Indoor location estimation based on LED visible light communication using multiple optical receivers," IEEE Communications Letters, vol. 17, no. 9, pp. 1834–1837, 2013.

[29] F. I. K. Mousa, H. Le-Minh, Z. Ghassemlooy et al., "Indoor localization system utilizing two visible light emitting diodes," Optical Engineering, vol. 55, no. 11, 2016.

[30] K. Wang, A. Nirmalathas, C. Lim, K. Alameh, and E. Skafidas, "Optical wireless-based indoor localization system employing a single-channel imaging receiver," Journal of Lightwave Technology, vol. 34, no. 4, pp. 1141–1149, 2016.

[31] S.-H. Jung, B.-C. Moon, and D. Han, "Unsupervised learning for crowdsourced indoor localization in wireless networks," IEEE Transactions on Mobile Computing, vol. 15, no. 11, pp. 2892–2906, 2016.

[32] X. Wang, L. Gao, S. Mao, and S. Pandey, "CSI-Based Fingerprinting for Indoor Localization: A Deep Learning Approach," IEEE Transactions on Vehicular Technology, vol. 66, no. 1, pp. 763–776, 2017.

[33] Y. Hu and G. Leus, "Robust differential received signal strength-based localization," IEEE Transactions on Signal Processing, vol. 65, no. 12, pp. 3261–3276, 2017.

[34] T.-H. Do and M. Yoo, "TDOA-based indoor positioning using visible light," Photonic Network Communications, vol. 27, no. 2, pp. 80–88, 2014.

[35] S.-Y. Jung, S. Hann, and C.-S. Park, "TDOA-based optical wireless indoor localization using LED ceiling lamps," IEEE Transactions on Consumer Electronics, vol. 57, no. 4, pp. 1592–1597, 2011.

[36] U. Nadeem, N. U. Hassan, M. A. Pasha, and C. Yuen, "Highly accurate 3D wireless indoor positioning system using white LED lights," IEEE Electronics Letters, vol. 50, no. 11, pp. 828–830, 2014.

[37] T. Q. Wang, Y. A. Sekercioglu, A. Neild, and J. Armstrong, "Position accuracy of time-of-arrival based ranging using visible light with application in indoor localization systems," Journal of Lightwave Technology, vol. 31, no. 20, pp. 3302–3308, 2013.

[38] O. Ennasr, G. Xing, and X. Tan, "Distributed time-difference-of-arrival (TDOA)-based localization of a moving target," in Proceedings of the 55th IEEE Conference on Decision and Control (CDC 2016), pp. 2652–2658, USA, December 2016.

[39] S. H. Yang, E. M. Jeong, and S. K. Han, "Indoor positioning based on received optical power difference by angle of arrival," IEEE Electronics Letters, vol. 50, no. 1, pp. 49–51, 2014.

[40] A. Arafa, X. Jin, and R. Klukas, "Wireless indoor optical positioning with a differential photosensor," IEEE Photonics Technology Letters, vol. 24, no. 12, pp. 1027–1029, 2012.

[41] M. T. Hossan, M. Z. Chowdhury, M. K. Hasan et al., "A New Vehicle Localization Scheme Based on Combined Optical Camera Communication and Photogrammetry," Mobile Information Systems, vol. 2018, Article ID 8501898, 14 pages, 2018.

[42] S. M. Berman, D. S. Greenhouse, I. L. Bailey, R. D. Clear, and T. W. Raasch, "Human electroretinogram responses to video displays, fluorescent lighting, and other high frequency sources," Optometry and Vision Science, vol. 68, no. 8, pp. 645–662, 1991.


[43] H.-Y. Lee, H.-M. Lin, Y.-L. Wei, H.-I. Wu, H.-M. Tsai, and K. C.-J. Lin, "RollingLight: Enabling line-of-sight light-to-camera communications," in Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services (MobiSys 2015), pp. 167–180, Italy, May 2015.

[44] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.

[45] H. A. Martins, J. R. Birk, and R. B. Kelley, "Camera models based on data from two calibration planes," Computer Graphics and Image Processing, vol. 17, no. 2, pp. 173–180, 1981.

[46] E. Arthurs and H. Dym, "On the Optimum Detection of Digital Signals in the Presence of White Gaussian Noise – A Geometric Interpretation and a Study of Three Basic Data Transmission Systems," IRE Transactions on Communications Systems, vol. 10, no. 4, pp. 336–372, 1962.



the characteristics of the rolling image are such that the stripscan be formed horizontally However in our case we rotatedthe image using imageView at 270∘ and this is the reason forthe observation of vertically formed strips

To demodulate the data from the detected strip patternwe should initially measure the width of each dark and whitestrip We know different camera has different frame rate andalso for a fixed camera frame rate fluctuates within a shortrange For this reason the width also varies and as the widthis measured in a scale of number of pixels in a row it shouldbe an integer value In our system we observed brighter stripsat which the width varied between 5 and 7 pixels whereas itvaried between 1 and 3 pixels for darker strips Therefore themidvalue was considered to be four to distinguish betweendark and white patterns When the measured width wasgreater than four smart phone will consider the LED to bein the ON state and store a binary lsquo1rsquo value Otherwise theLED is assumed to be in the OFF state and stores a lsquo0rsquo valueFigure 13 illustrates the android implementation software thatcan simultaneously receive data from four LEDsWe receivedfive bits of ID per LED and defined each combination ofthe received binary ID to a fixed location in the applicationsystem The table in Figure 13 depicts the manner in whichthe location is assigned to different IDs To receive multipleLEDs the overall time for processing one frame should bewithin the critical frame time Otherwise the applicationmayunfortunately be stoppedThe critical frame time is defined asthe maximum processing time for one frame In our systemwe used 20 fps to limit the critical frame time to 50 msec Weadd a starting symbol to recognize when the data is startingSo the time to get the data obviously depends on the timingof getting the frame containing that symbol We test in real-time debugging the software in android studio We observethe timewhich is required for initialization frameprocessingdemodulating and device responseWefigure out that almost

Wireless Communications and Mobile Computing 11

CoordinateBits(11)00000(12)00001(21)00010(22)00011(13)00100

Figure 13 Implemented android application receiving multipleLEDs ID simultaneously

1000 ms is required to debug and getting device responsewhen communicating with 4 LEDs simultaneously As eachLED is transmitting 10 bits we got data at 40 bps rate for 4LEDs with this device

8 Conclusion

This work focuses on the reception of data from multipleindoor LED transmitters as well as on the localization andnavigation of the MR processing the received data Theindoor LED locations are received as binary data of five bitsper link at a data rate of 40 bps The proposed algorithmwas implemented in an indoor platform where four LEDshaving sizes of 71 cm2 were kept at a distance of 50 cmfrom each other The four LEDs transmit four different IDsand an android application that was developed to receivethese IDs performed well simultaneously We were able tocommunicate with the LEDs from a maximum distance of480 cm As our positioning approach requires measurementof the direct distance from the camera to the LED wecompare the accuracy of measuring the distance at differentfloor positions Maintaining the floor height to be constantwe experience an increase in the percentage of error whenthe camera moves a long distance horizontally To mitigatethis error we fix some recorded distance values within theapplication to compareWe have foundmaximum 2 cm errorat 100 cm horizontal distance from the floor position of thetarget LED when the MR moves horizontally toward thatLED Thus the accuracy is improved from the localization

and navigation viewpoint as well as based on the distance forindoor communication

Data Availability

The data used to support the findings of this study areavailable from the corresponding author upon request

Conflicts of Interest

The authors declare that they have no conflicts of interest

Acknowledgments

This research was supported by the MSIT (Ministry ofScience and ICT) Korea under the Global IT Talent sup-port program (IITP-2017-0-01806) supervised by the IITP(Institute for Information and Communication Technol-ogy Promotion) and the National Research Foundationof Korea (NRF) grant funded by the Korea government(Performance Enhancement and Resource Management ofVisual MIMO-based Camera Communication Networks)(No NRF-2016R1D1A1A09919091)

References

[1] M Z Chowdhury M T Hossan A Islam and Y M Jang ldquoAComparative Survey of Optical Wireless Technologies Archi-tectures and Applicationsrdquo IEEE Access vol 6 pp 9819ndash98402018

[2] M T Hossan M Z Chowdhury A Islam and Y M JangldquoA Novel Indoor Mobile Localization System Based on Opti-cal Camera Communicationrdquo Wireless Communications andMobile Computing vol 2018 17 pages 2018

[3] A Khalajmehrabadi N Gatsis and D Akopian ldquoMod-ern WLAN Fingerprinting Indoor Positioning Methods andDeployment Challengesrdquo IEEE Communications Surveys ampTutorials vol 19 no 3 pp 1974ndash2002 2017

[4] E Kupershtein M Wax and I Cohen ldquoSingle-site emitterlocalization via multipath fingerprintingrdquo IEEE Transactions onSignal Processing vol 61 no 1 pp 10ndash21 2013

[5] P Luo et al ldquoExperimental demonstration of RGB LED-basedoptical camera communicationsrdquo IEEE Photonics Journal vol 7no 5 pp 1ndash12 2015

[6] R D Roberts ldquoUndersampled frequency shift ON-OFF key-ing (UFSOOK) for camera communications (CamCom)rdquo inProceedings of the 22nd Wireless and Optical CommunicationsConference WOCC 2013 pp 645ndash648 China May 2013

[7] P Luo Z Ghassemlooy H L Minh H Tsai and X TangldquoUndersampled-PAM with subcarrier modulation for cameracommunicationsrdquo in Proceedings of the 20th Opto-Electronicsand Communications Conference (OECC) pp 1ndash3 ShanghaiChina 2015

[8] Y-Y Won S-H Yang D-H Kim and S-K Han ldquoThree-dimensional optical wireless indoor positioning system usinglocation code map based on power distribution of visible lightemitting dioderdquo IET Optoelectronics vol 7 no 3 pp 77ndash832013

[9] C Yang andH-R Shao ldquoWiFi-based indoor positioningrdquo IEEECommunications Magazine vol 53 no 3 pp 150ndash157 2015

12 Wireless Communications and Mobile Computing

[10] R Xu W Chen Y Xu and S Ji ldquoA New Indoor PositioningSystem Architecture Using GPS Signalsrdquo Sensors vol 15 no 5pp 10074ndash10087 2015

[11] Z Zhou M Kavehrad and P Deng ldquoIndoor positioning algo-rithm using light-emitting diode visible light communicationsrdquoOptical Engineering vol 51 no 8 2012

[12] W Zhang M I S Chowdhury and M Kavehrad ldquoAsyn-chronous indoor positioning system based on visible lightcommunicationsrdquo Optical Engineering vol 53 no 4 Article ID045105 2014

[13] S-Y Jung S Hann S Park and C-S Park ldquoOptical wirelessindoor positioning system using light emitting diode ceilinglightsrdquo Microwave and Optical Technology Letters vol 54 no7 pp 1622ndash1626 2012

[14] H-S Kim D-R Kim S-H Yang Y-H Son and S-K HanldquoAn indoor visible light communication positioning systemusing a RF carrier allocation techniquerdquo Journal of LightwaveTechnology vol 31 no 1 pp 134ndash144 2013

[15] K Panta and J Armstrong ldquoIndoor localisation using whiteLEDsrdquo IEEE Electronics Letters vol 48 no 4 pp 228ndash230 2012

[16] A D Cheok and L Yue ldquoA novel light-sensor-based informa-tion transmission system for indoor positioning and naviga-tionrdquo IEEE Transactions on Instrumentation and Measurementvol 60 no 1 pp 290ndash299 2011

[17] M Nakajima and S Haruyama ldquoNew indoor navigation systemfor visually impaired people using visible light communicationrdquoEURASIP Journal onWireless Communications and Networkingvol 2013 article 37 2013

[18] H Huang A Yang L Feng G Ni and P Guo ldquoIndoorPositioning Method Based on Metameric White Light Sourcesand Subpixels on aColor Image Sensorrdquo IEEEPhotonics Journalvol 8 no 6 2016

[19] M S Hossen Y Park and K Kim ldquoPerformance improvementof indoor positioning using light-emitting diodes and an imagesensor for light-emitting diode communicationrdquo Optical Engi-neering vol 54 no 3 pp 1ndash11 2015

[20] M-G Moon S-I Choi J Park and J Y Kim ldquoIndoorpositioning system using LED lights and a dual image sensorrdquoJournal of the Optical Society of Korea vol 19 no 6 pp 586ndash5912015

[21] Y-S Kuo P Pannuto K-J Hsiao and P Dutta ldquoLuxaposeIndoor positioning with mobile phones and visible lightrdquo inProceedings of the 20th ACM Annual International ConferenceonMobile Computing and Networking MobiCom 2014 pp 447ndash458 USA September 2014

[22] P Huynh and M Yoo ldquoVLC-based positioning system for anindoor environment using an image sensor and an accelerome-ter sensorrdquo Sensors vol 16 no 6 2016

[23] H Steendam T QWang and J Armstrong ldquoTheoretical lowerbound for indoor visible light positioning using received signalstrengthmeasurements and an aperture-based receiverrdquo Journalof Lightwave Technology vol 35 no 2 pp 309ndash319 2017

[24] F Yin Y Zhao F Gunnarsson and F Gustafsson ldquoReceived-signal-strength threshold optimization using Gaussian pro-cessesrdquo IEEE Transactions on Signal Processing vol 65 no 8pp 2164ndash2177 2017

[25] X Piao Y Zhang T Li et al ldquoRSS fingerprint based indoorlocalization using sparse representation with spatio-temporalconstraintrdquo Sensors vol 16 no 11 2016

[26] C Feng W S A Au S Valaee and Z Tan ldquoReceived-signal-strength-based indoor positioning using compressive sensingrdquo

IEEETransactions onMobile Computing vol 11 no 12 pp 1983ndash1993 2012

[27] W Gu W Zhang M Kavehrad and L Feng ldquoThree-dimensional light positioning algorithm with filtering tech-niques for indoor environmentsrdquo Optical Engineering vol 53no 10 2014

[28] S-H Yang E-M Jung and S-K Han ldquoIndoor locationestimation based on LED visible light communication usingmultiple optical receiversrdquo IEEE Communications Letters vol17 no 9 pp 1834ndash1837 2013

[29] F I K Mousa H Le-Minh Z Ghassemlooy et al ldquoIndoorlocalization system utilizing two visible light emitting diodesrdquoOptical Engineering vol 55 no 11 2016

[30] K Wang A Nirmalathas C Lim K Alameh and E SkafidasldquoOptical wireless-based indoor localization system employing asingle-channel imaging receiverrdquo Journal of Lightwave Technol-ogy vol 34 no 4 pp 1141ndash1149 2016

[31] S-H Jung B-CMoon andDHan ldquoUnsupervised learning forcrowdsourced indoor localization in wireless networksrdquo IEEETransactions on Mobile Computing vol 15 no 11 pp 2892ndash2906 2016

[32] X Wang L Gao S Mao and S Pandey ldquoCSI-Based Finger-printing for Indoor Localization A Deep Learning ApproachrdquoIEEE Transactions on Vehicular Technology vol 66 no 1 pp763ndash776 2017

[33] Y Hu andG Leus ldquoRobust differential received signal strength-based localizationrdquo IEEE Transactions on Signal Processing vol65 no 12 pp 3261ndash3276 2017

[34] T-H Do and M Yoo ldquoTDOA-based indoor positioning usingvisible lightrdquo Photonic Network Communications vol 27 no 2pp 80ndash88 2014

[35] S-Y Jung S Hann and C-S Park ldquoTDOA-based opticalwireless indoor localization using LED ceiling lampsrdquo IEEETransactions on Consumer Electronics vol 57 no 4 pp 1592ndash1597 2011

[36] U Nadeem N U Hassan M A Pasha and C Yuen ldquoHighlyaccurate 3D wireless indoor positioning system using whiteLED lightsrdquo IEEE Electronics Letters vol 50 no 11 pp 828ndash8302014

[37] TQWang Y A Sekercioglu ANeild and J Armstrong ldquoPosi-tion accuracy of time-of-arrival based ranging using visiblelight with application in indoor localization systemsrdquo Journalof Lightwave Technology vol 31 no 20 pp 3302ndash3308 2013

[38] O Ennasr G Xing and X Tan ldquoDistributed time-difference-of-arrival (TDOA)-based localization of a moving targetrdquo inProceedings of the 55th IEEEConference onDecision andControlCDC 2016 pp 2652ndash2658 USA December 2016

[39] S H Yang E M Jeong and S K Han ldquoIndoor positioningbased on received optical power difference by angle of arrivalrdquoIEEE Electronics Letters vol 50 no 1 pp 49ndash51 2014

[40] A Arafa X Jin and R Klukas ldquoWireless indoor opticalpositioning with a differential photosensorrdquo IEEE PhotonicsTechnology Letters vol 24 no 12 pp 1027ndash1029 2012

[41] M T Hossan M Z Chowdhury M K Hasan et al ldquoA NewVehicle Localization SchemeBased onCombinedOptical Cam-era Communication and PhotogrammetryrdquoMobile InformationSystems vol 2018 Article ID 8501898 14 pages 2018

[42] S M Berman D S Greenhouse I L Bailey R D Clearand T W Raasch ldquoHuman electroretinogram responses tovideo displays fluorescent lighting and other high frequencysourcesrdquo Optometry and Vision Science vol 68 no 8 pp 645ndash662 1991

Wireless Communications and Mobile Computing 13



Figure 12: Experimental setup. (a) Four LEDs, each with an area of 71 cm² and placed 50 cm apart from each other, transmit the location IDs. (b) Image captured after setting the exposure time to 1/8000 sec.

orders (M = 2, 4, and 8). We observed that the bit-error probability decreases as M increases. When the transmission is performed with the same bit energy, the symbol duration for 4-FSK is double that for 2-FSK while the same bit rate is maintained; in such a case, the BER becomes lower because the symbol duration increases with a higher modulation order.
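The relationship between modulation order and symbol duration at a fixed bit rate can be sketched as follows. This is an illustrative snippet, not code from the paper; the bit rate value is a hypothetical placeholder.

```python
from math import log2

def symbol_duration(bit_rate_bps: float, M: int) -> float:
    """Symbol duration T_s for M-FSK at a fixed bit rate.

    Each M-FSK symbol carries log2(M) bits, so T_s = log2(M) / R_b.
    At the same bit rate, a 4-FSK symbol therefore lasts twice as long
    as a 2-FSK symbol, collecting more energy per symbol.
    """
    return log2(M) / bit_rate_bps

R_b = 1000.0  # illustrative bit rate in bps (not from the paper)
for M in (2, 4, 8):
    print(M, symbol_duration(R_b, M))  # doubles from M=2 to M=4
```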

7. Demonstration Results

We used a platform to demonstrate our OCC system and evaluate its performance. To understand the ID distribution system, the location information must be received using a minimum of two diagonally located LEDs, as described in Section 5. The platform comprises four LED transmitter links placed in a square shape, as depicted in Figure 12(a). The LEDs are arranged with a distance of 50 cm between them. For clarity of explanation, we consider these transmitters to be a single unit. These four transmitter links operate continuously and transmit four different location IDs, which are encoded with the help of a PC and an LED driver circuit.

To focus only on the target transmitting LED, we controlled the camera exposure time, i.e., the duration for which each pixel is exposed to light. Using this process, we can obtain the dark and white strips in the region where the LEDs are captured, by darkening all parts of the image except those depicting the target LEDs. Figure 12(b) depicts the image of the transmitting LED unit captured using the Samsung S7 (edge) camera after setting the camera's CaptureRequest.SENSOR_EXPOSURE_TIME to 1/8000 sec. We can observe that all four LEDs exhibit the same dark and bright strip widths. This is because the same modulation frequencies, ranging from 2 to 4 kHz, were used for all four transmitters. Generally,

the characteristics of the rolling-shutter image are such that the strips are formed horizontally. However, in our case, we rotated the image by 270° using imageView, which is why the strips appear vertically.

To demodulate the data from the detected strip pattern, we must first measure the width of each dark and white strip. Different cameras have different frame rates, and even for a fixed camera, the frame rate fluctuates within a short range. For this reason, the width also varies; because the width is measured as a number of pixels in a row, it must be an integer value. In our system, we observed that the width of the brighter strips varied between 5 and 7 pixels, whereas that of the darker strips varied between 1 and 3 pixels. Therefore, a mid-value of four was chosen to distinguish between the dark and white patterns. When the measured width is greater than four, the smartphone considers the LED to be in the ON state and stores a binary '1'; otherwise, the LED is assumed to be in the OFF state and a '0' is stored. Figure 13 illustrates the Android application, which can simultaneously receive data from four LEDs. We received five bits of ID per LED and mapped each combination of the received binary ID to a fixed location in the application. The table in Figure 13 depicts the manner in which locations are assigned to different IDs. To receive multiple LEDs, the overall processing time for one frame must remain within the critical frame time; otherwise, the application may stop. The critical frame time is defined as the maximum processing time for one frame. In our system, we used 20 fps, which limits the critical frame time to 50 msec. We added a starting symbol to indicate when the data begins, so the time required to receive the data depends on when the frame containing that symbol arrives. We tested this in real time by debugging the software in Android Studio, observing the time required for initialization, frame processing, demodulation, and device response. We found that almost
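The strip-width thresholding described above can be sketched as follows. This is a minimal illustration of the decision rule, assuming the measured widths are already extracted from a row scan; the example widths are hypothetical.

```python
def demodulate_strips(widths, threshold=4):
    """Map measured strip widths (in pixels) to bits.

    In the experiment, bright (LED ON) strips were 5-7 px wide and dark
    (LED OFF) strips 1-3 px, so a mid-value threshold of 4 separates
    '1' (width > threshold) from '0' (width <= threshold).
    """
    return ''.join('1' if w > threshold else '0' for w in widths)

# Hypothetical widths scanned across one LED region:
print(demodulate_strips([6, 2, 5, 1, 7]))  # -> '10101'
```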


Coordinate   Bits
(1,1)        00000
(1,2)        00001
(2,1)        00010
(2,2)        00011
(1,3)        00100

Figure 13: Implemented Android application receiving multiple LED IDs simultaneously.

1000 ms is required for debugging and obtaining a device response when communicating with four LEDs simultaneously. As each LED transmits 10 bits, we obtained data at a rate of 40 bps for the four LEDs with this device.
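The ID-to-coordinate lookup and the resulting aggregate data rate can be sketched as follows. The table mirrors the assignments shown in Figure 13, but the dictionary and function names are hypothetical, and the 1 s response time is taken from the measurement above.

```python
# ID table mirroring Figure 13: 5-bit IDs mapped to grid coordinates.
ID_TABLE = {
    '00000': (1, 1),
    '00001': (1, 2),
    '00010': (2, 1),
    '00011': (2, 2),
    '00100': (1, 3),
}

def locate(bits: str):
    """Return the coordinate assigned to a received 5-bit ID (None if unknown)."""
    return ID_TABLE.get(bits)

# Aggregate rate: 4 LEDs x 10 bits each, decoded in ~1000 ms -> 40 bps.
leds, bits_per_led, response_time_s = 4, 10, 1.0
print(leds * bits_per_led / response_time_s)  # -> 40.0
```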

8. Conclusion

This work focuses on the reception of data from multiple indoor LED transmitters as well as on the localization and navigation of the MR using the received data. The indoor LED locations are received as binary data of five bits per link at a data rate of 40 bps. The proposed algorithm was implemented in an indoor platform where four LEDs, each with an area of 71 cm², were kept at a distance of 50 cm from each other. The four LEDs transmit four different IDs, and the Android application developed to receive these IDs performed well with all four simultaneously. We were able to communicate with the LEDs from a maximum distance of 480 cm. As our positioning approach requires measurement of the direct distance from the camera to the LED, we compared the accuracy of the distance measurement at different floor positions. With the floor height kept constant, the percentage error increases when the camera moves a long distance horizontally. To mitigate this error, we fixed some recorded distance values within the application for comparison. We found a maximum error of 2 cm at a horizontal distance of 100 cm from the floor position of the target LED when the MR moves horizontally toward that LED. Thus, the accuracy is improved from the localization and navigation viewpoint as well as in terms of the indoor communication distance.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the Global IT Talent support program (IITP-2017-0-01806) supervised by the IITP (Institute for Information and Communication Technology Promotion), and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (Performance Enhancement and Resource Management of Visual MIMO-based Camera Communication Networks) (No. NRF-2016R1D1A1A09919091).

References

[1] M. Z. Chowdhury, M. T. Hossan, A. Islam, and Y. M. Jang, "A comparative survey of optical wireless technologies: architectures and applications," IEEE Access, vol. 6, pp. 9819–9840, 2018.

[2] M. T. Hossan, M. Z. Chowdhury, A. Islam, and Y. M. Jang, "A novel indoor mobile localization system based on optical camera communication," Wireless Communications and Mobile Computing, vol. 2018, 17 pages, 2018.

[3] A. Khalajmehrabadi, N. Gatsis, and D. Akopian, "Modern WLAN fingerprinting indoor positioning: methods and deployment challenges," IEEE Communications Surveys & Tutorials, vol. 19, no. 3, pp. 1974–2002, 2017.

[4] E. Kupershtein, M. Wax, and I. Cohen, "Single-site emitter localization via multipath fingerprinting," IEEE Transactions on Signal Processing, vol. 61, no. 1, pp. 10–21, 2013.

[5] P. Luo et al., "Experimental demonstration of RGB LED-based optical camera communications," IEEE Photonics Journal, vol. 7, no. 5, pp. 1–12, 2015.

[6] R. D. Roberts, "Undersampled frequency shift ON-OFF keying (UFSOOK) for camera communications (CamCom)," in Proceedings of the 22nd Wireless and Optical Communications Conference, WOCC 2013, pp. 645–648, China, May 2013.

[7] P. Luo, Z. Ghassemlooy, H. L. Minh, H. Tsai, and X. Tang, "Undersampled-PAM with subcarrier modulation for camera communications," in Proceedings of the 20th Opto-Electronics and Communications Conference (OECC), pp. 1–3, Shanghai, China, 2015.

[8] Y.-Y. Won, S.-H. Yang, D.-H. Kim, and S.-K. Han, "Three-dimensional optical wireless indoor positioning system using location code map based on power distribution of visible light emitting diode," IET Optoelectronics, vol. 7, no. 3, pp. 77–83, 2013.

[9] C. Yang and H.-R. Shao, "WiFi-based indoor positioning," IEEE Communications Magazine, vol. 53, no. 3, pp. 150–157, 2015.


[10] R. Xu, W. Chen, Y. Xu, and S. Ji, "A new indoor positioning system architecture using GPS signals," Sensors, vol. 15, no. 5, pp. 10074–10087, 2015.

[11] Z. Zhou, M. Kavehrad, and P. Deng, "Indoor positioning algorithm using light-emitting diode visible light communications," Optical Engineering, vol. 51, no. 8, 2012.

[12] W. Zhang, M. I. S. Chowdhury, and M. Kavehrad, "Asynchronous indoor positioning system based on visible light communications," Optical Engineering, vol. 53, no. 4, Article ID 045105, 2014.

[13] S.-Y. Jung, S. Hann, S. Park, and C.-S. Park, "Optical wireless indoor positioning system using light emitting diode ceiling lights," Microwave and Optical Technology Letters, vol. 54, no. 7, pp. 1622–1626, 2012.

[14] H.-S. Kim, D.-R. Kim, S.-H. Yang, Y.-H. Son, and S.-K. Han, "An indoor visible light communication positioning system using a RF carrier allocation technique," Journal of Lightwave Technology, vol. 31, no. 1, pp. 134–144, 2013.

[15] K. Panta and J. Armstrong, "Indoor localisation using white LEDs," IEEE Electronics Letters, vol. 48, no. 4, pp. 228–230, 2012.

[16] A. D. Cheok and L. Yue, "A novel light-sensor-based information transmission system for indoor positioning and navigation," IEEE Transactions on Instrumentation and Measurement, vol. 60, no. 1, pp. 290–299, 2011.

[17] M. Nakajima and S. Haruyama, "New indoor navigation system for visually impaired people using visible light communication," EURASIP Journal on Wireless Communications and Networking, vol. 2013, article 37, 2013.

[18] H. Huang, A. Yang, L. Feng, G. Ni, and P. Guo, "Indoor positioning method based on metameric white light sources and subpixels on a color image sensor," IEEE Photonics Journal, vol. 8, no. 6, 2016.

[19] M. S. Hossen, Y. Park, and K. Kim, "Performance improvement of indoor positioning using light-emitting diodes and an image sensor for light-emitting diode communication," Optical Engineering, vol. 54, no. 3, pp. 1–11, 2015.

[20] M.-G. Moon, S.-I. Choi, J. Park, and J. Y. Kim, "Indoor positioning system using LED lights and a dual image sensor," Journal of the Optical Society of Korea, vol. 19, no. 6, pp. 586–591, 2015.

[21] Y.-S. Kuo, P. Pannuto, K.-J. Hsiao, and P. Dutta, "Luxapose: indoor positioning with mobile phones and visible light," in Proceedings of the 20th ACM Annual International Conference on Mobile Computing and Networking, MobiCom 2014, pp. 447–458, USA, September 2014.

[22] P. Huynh and M. Yoo, "VLC-based positioning system for an indoor environment using an image sensor and an accelerometer sensor," Sensors, vol. 16, no. 6, 2016.

[23] H. Steendam, T. Q. Wang, and J. Armstrong, "Theoretical lower bound for indoor visible light positioning using received signal strength measurements and an aperture-based receiver," Journal of Lightwave Technology, vol. 35, no. 2, pp. 309–319, 2017.

[24] F. Yin, Y. Zhao, F. Gunnarsson, and F. Gustafsson, "Received-signal-strength threshold optimization using Gaussian processes," IEEE Transactions on Signal Processing, vol. 65, no. 8, pp. 2164–2177, 2017.

[25] X. Piao, Y. Zhang, T. Li et al., "RSS fingerprint based indoor localization using sparse representation with spatio-temporal constraint," Sensors, vol. 16, no. 11, 2016.

[26] C. Feng, W. S. A. Au, S. Valaee, and Z. Tan, "Received-signal-strength-based indoor positioning using compressive sensing," IEEE Transactions on Mobile Computing, vol. 11, no. 12, pp. 1983–1993, 2012.

[27] W. Gu, W. Zhang, M. Kavehrad, and L. Feng, "Three-dimensional light positioning algorithm with filtering techniques for indoor environments," Optical Engineering, vol. 53, no. 10, 2014.

[28] S.-H. Yang, E.-M. Jung, and S.-K. Han, "Indoor location estimation based on LED visible light communication using multiple optical receivers," IEEE Communications Letters, vol. 17, no. 9, pp. 1834–1837, 2013.

[29] F. I. K. Mousa, H. Le-Minh, Z. Ghassemlooy et al., "Indoor localization system utilizing two visible light emitting diodes," Optical Engineering, vol. 55, no. 11, 2016.

[30] K. Wang, A. Nirmalathas, C. Lim, K. Alameh, and E. Skafidas, "Optical wireless-based indoor localization system employing a single-channel imaging receiver," Journal of Lightwave Technology, vol. 34, no. 4, pp. 1141–1149, 2016.

[31] S.-H. Jung, B.-C. Moon, and D. Han, "Unsupervised learning for crowdsourced indoor localization in wireless networks," IEEE Transactions on Mobile Computing, vol. 15, no. 11, pp. 2892–2906, 2016.

[32] X. Wang, L. Gao, S. Mao, and S. Pandey, "CSI-based fingerprinting for indoor localization: a deep learning approach," IEEE Transactions on Vehicular Technology, vol. 66, no. 1, pp. 763–776, 2017.

[33] Y. Hu and G. Leus, "Robust differential received signal strength-based localization," IEEE Transactions on Signal Processing, vol. 65, no. 12, pp. 3261–3276, 2017.

[34] T.-H. Do and M. Yoo, "TDOA-based indoor positioning using visible light," Photonic Network Communications, vol. 27, no. 2, pp. 80–88, 2014.

[35] S.-Y. Jung, S. Hann, and C.-S. Park, "TDOA-based optical wireless indoor localization using LED ceiling lamps," IEEE Transactions on Consumer Electronics, vol. 57, no. 4, pp. 1592–1597, 2011.

[36] U. Nadeem, N. U. Hassan, M. A. Pasha, and C. Yuen, "Highly accurate 3D wireless indoor positioning system using white LED lights," IEEE Electronics Letters, vol. 50, no. 11, pp. 828–830, 2014.

[37] T. Q. Wang, Y. A. Sekercioglu, A. Neild, and J. Armstrong, "Position accuracy of time-of-arrival based ranging using visible light with application in indoor localization systems," Journal of Lightwave Technology, vol. 31, no. 20, pp. 3302–3308, 2013.

[38] O. Ennasr, G. Xing, and X. Tan, "Distributed time-difference-of-arrival (TDOA)-based localization of a moving target," in Proceedings of the 55th IEEE Conference on Decision and Control, CDC 2016, pp. 2652–2658, USA, December 2016.

[39] S. H. Yang, E. M. Jeong, and S. K. Han, "Indoor positioning based on received optical power difference by angle of arrival," IEEE Electronics Letters, vol. 50, no. 1, pp. 49–51, 2014.

[40] A. Arafa, X. Jin, and R. Klukas, "Wireless indoor optical positioning with a differential photosensor," IEEE Photonics Technology Letters, vol. 24, no. 12, pp. 1027–1029, 2012.

[41] M. T. Hossan, M. Z. Chowdhury, M. K. Hasan et al., "A new vehicle localization scheme based on combined optical camera communication and photogrammetry," Mobile Information Systems, vol. 2018, Article ID 8501898, 14 pages, 2018.

[42] S. M. Berman, D. S. Greenhouse, I. L. Bailey, R. D. Clear, and T. W. Raasch, "Human electroretinogram responses to video displays, fluorescent lighting, and other high frequency sources," Optometry and Vision Science, vol. 68, no. 8, pp. 645–662, 1991.


[43] H.-Y. Lee, H.-M. Lin, Y.-L. Wei, H.-I. Wu, H.-M. Tsai, and K. C.-J. Lin, "RollingLight: enabling line-of-sight light-to-camera communications," in Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services, MobiSys 2015, pp. 167–180, Italy, May 2015.

[44] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.

[45] H. A. Martins, J. R. Birk, and R. B. Kelley, "Camera models based on data from two calibration planes," Computer Graphics and Image Processing, vol. 17, no. 2, pp. 173–180, 1981.

[46] E. Arthurs and H. Dym, "On the optimum detection of digital signals in the presence of white Gaussian noise—a geometric interpretation and a study of three basic data transmission systems," IRE Transactions on Communications Systems, vol. 10, no. 4, pp. 336–372, 1962.


Page 11: An Implementation Approach and Performance …...Analysis of Image Sensor Based Multilateral Indoor Localization and Navigation System Md.Shahjalal ,Md.TanvirHossan ,Moh.KhalidHasan

Wireless Communications and Mobile Computing 11

CoordinateBits(11)00000(12)00001(21)00010(22)00011(13)00100

Figure 13 Implemented android application receiving multipleLEDs ID simultaneously

1000 ms is required to debug and getting device responsewhen communicating with 4 LEDs simultaneously As eachLED is transmitting 10 bits we got data at 40 bps rate for 4LEDs with this device

8 Conclusion

This work focuses on the reception of data from multipleindoor LED transmitters as well as on the localization andnavigation of the MR processing the received data Theindoor LED locations are received as binary data of five bitsper link at a data rate of 40 bps The proposed algorithmwas implemented in an indoor platform where four LEDshaving sizes of 71 cm2 were kept at a distance of 50 cmfrom each other The four LEDs transmit four different IDsand an android application that was developed to receivethese IDs performed well simultaneously We were able tocommunicate with the LEDs from a maximum distance of480 cm As our positioning approach requires measurementof the direct distance from the camera to the LED wecompare the accuracy of measuring the distance at differentfloor positions Maintaining the floor height to be constantwe experience an increase in the percentage of error whenthe camera moves a long distance horizontally To mitigatethis error we fix some recorded distance values within theapplication to compareWe have foundmaximum 2 cm errorat 100 cm horizontal distance from the floor position of thetarget LED when the MR moves horizontally toward thatLED Thus the accuracy is improved from the localization

and navigation viewpoint as well as based on the distance forindoor communication

Data Availability

The data used to support the findings of this study areavailable from the corresponding author upon request

Conflicts of Interest

The authors declare that they have no conflicts of interest

Acknowledgments

This research was supported by the MSIT (Ministry ofScience and ICT) Korea under the Global IT Talent sup-port program (IITP-2017-0-01806) supervised by the IITP(Institute for Information and Communication Technol-ogy Promotion) and the National Research Foundationof Korea (NRF) grant funded by the Korea government(Performance Enhancement and Resource Management ofVisual MIMO-based Camera Communication Networks)(No NRF-2016R1D1A1A09919091)

References

[1] M Z Chowdhury M T Hossan A Islam and Y M Jang ldquoAComparative Survey of Optical Wireless Technologies Archi-tectures and Applicationsrdquo IEEE Access vol 6 pp 9819ndash98402018

[2] M T Hossan M Z Chowdhury A Islam and Y M JangldquoA Novel Indoor Mobile Localization System Based on Opti-cal Camera Communicationrdquo Wireless Communications andMobile Computing vol 2018 17 pages 2018

[3] A Khalajmehrabadi N Gatsis and D Akopian ldquoMod-ern WLAN Fingerprinting Indoor Positioning Methods andDeployment Challengesrdquo IEEE Communications Surveys ampTutorials vol 19 no 3 pp 1974ndash2002 2017

[4] E Kupershtein M Wax and I Cohen ldquoSingle-site emitterlocalization via multipath fingerprintingrdquo IEEE Transactions onSignal Processing vol 61 no 1 pp 10ndash21 2013

[5] P Luo et al ldquoExperimental demonstration of RGB LED-basedoptical camera communicationsrdquo IEEE Photonics Journal vol 7no 5 pp 1ndash12 2015

[6] R D Roberts ldquoUndersampled frequency shift ON-OFF key-ing (UFSOOK) for camera communications (CamCom)rdquo inProceedings of the 22nd Wireless and Optical CommunicationsConference WOCC 2013 pp 645ndash648 China May 2013

[7] P Luo Z Ghassemlooy H L Minh H Tsai and X TangldquoUndersampled-PAM with subcarrier modulation for cameracommunicationsrdquo in Proceedings of the 20th Opto-Electronicsand Communications Conference (OECC) pp 1ndash3 ShanghaiChina 2015

[8] Y-Y Won S-H Yang D-H Kim and S-K Han ldquoThree-dimensional optical wireless indoor positioning system usinglocation code map based on power distribution of visible lightemitting dioderdquo IET Optoelectronics vol 7 no 3 pp 77ndash832013

[9] C Yang andH-R Shao ldquoWiFi-based indoor positioningrdquo IEEECommunications Magazine vol 53 no 3 pp 150ndash157 2015

12 Wireless Communications and Mobile Computing

[10] R Xu W Chen Y Xu and S Ji ldquoA New Indoor PositioningSystem Architecture Using GPS Signalsrdquo Sensors vol 15 no 5pp 10074ndash10087 2015

[11] Z Zhou M Kavehrad and P Deng ldquoIndoor positioning algo-rithm using light-emitting diode visible light communicationsrdquoOptical Engineering vol 51 no 8 2012

[12] W Zhang M I S Chowdhury and M Kavehrad ldquoAsyn-chronous indoor positioning system based on visible lightcommunicationsrdquo Optical Engineering vol 53 no 4 Article ID045105 2014

[13] S-Y Jung S Hann S Park and C-S Park ldquoOptical wirelessindoor positioning system using light emitting diode ceilinglightsrdquo Microwave and Optical Technology Letters vol 54 no7 pp 1622ndash1626 2012

[14] H-S Kim D-R Kim S-H Yang Y-H Son and S-K HanldquoAn indoor visible light communication positioning systemusing a RF carrier allocation techniquerdquo Journal of LightwaveTechnology vol 31 no 1 pp 134ndash144 2013

[15] K Panta and J Armstrong ldquoIndoor localisation using whiteLEDsrdquo IEEE Electronics Letters vol 48 no 4 pp 228ndash230 2012

[16] A D Cheok and L Yue ldquoA novel light-sensor-based informa-tion transmission system for indoor positioning and naviga-tionrdquo IEEE Transactions on Instrumentation and Measurementvol 60 no 1 pp 290ndash299 2011

[17] M Nakajima and S Haruyama ldquoNew indoor navigation systemfor visually impaired people using visible light communicationrdquoEURASIP Journal onWireless Communications and Networkingvol 2013 article 37 2013

[18] H Huang A Yang L Feng G Ni and P Guo ldquoIndoorPositioning Method Based on Metameric White Light Sourcesand Subpixels on aColor Image Sensorrdquo IEEEPhotonics Journalvol 8 no 6 2016

[19] M S Hossen Y Park and K Kim ldquoPerformance improvementof indoor positioning using light-emitting diodes and an imagesensor for light-emitting diode communicationrdquo Optical Engi-neering vol 54 no 3 pp 1ndash11 2015

[20] M-G Moon S-I Choi J Park and J Y Kim ldquoIndoorpositioning system using LED lights and a dual image sensorrdquoJournal of the Optical Society of Korea vol 19 no 6 pp 586ndash5912015

[21] Y-S Kuo P Pannuto K-J Hsiao and P Dutta ldquoLuxaposeIndoor positioning with mobile phones and visible lightrdquo inProceedings of the 20th ACM Annual International ConferenceonMobile Computing and Networking MobiCom 2014 pp 447ndash458 USA September 2014

[22] P Huynh and M Yoo ldquoVLC-based positioning system for anindoor environment using an image sensor and an accelerome-ter sensorrdquo Sensors vol 16 no 6 2016

[23] H Steendam T QWang and J Armstrong ldquoTheoretical lowerbound for indoor visible light positioning using received signalstrengthmeasurements and an aperture-based receiverrdquo Journalof Lightwave Technology vol 35 no 2 pp 309ndash319 2017

[24] F Yin Y Zhao F Gunnarsson and F Gustafsson ldquoReceived-signal-strength threshold optimization using Gaussian pro-cessesrdquo IEEE Transactions on Signal Processing vol 65 no 8pp 2164ndash2177 2017

[25] X Piao Y Zhang T Li et al ldquoRSS fingerprint based indoorlocalization using sparse representation with spatio-temporalconstraintrdquo Sensors vol 16 no 11 2016

[26] C Feng W S A Au S Valaee and Z Tan ldquoReceived-signal-strength-based indoor positioning using compressive sensingrdquo

IEEETransactions onMobile Computing vol 11 no 12 pp 1983ndash1993 2012

[27] W Gu W Zhang M Kavehrad and L Feng ldquoThree-dimensional light positioning algorithm with filtering tech-niques for indoor environmentsrdquo Optical Engineering vol 53no 10 2014

[28] S-H Yang E-M Jung and S-K Han ldquoIndoor locationestimation based on LED visible light communication usingmultiple optical receiversrdquo IEEE Communications Letters vol17 no 9 pp 1834ndash1837 2013

[29] F I K Mousa H Le-Minh Z Ghassemlooy et al ldquoIndoorlocalization system utilizing two visible light emitting diodesrdquoOptical Engineering vol 55 no 11 2016

[30] K Wang A Nirmalathas C Lim K Alameh and E SkafidasldquoOptical wireless-based indoor localization system employing asingle-channel imaging receiverrdquo Journal of Lightwave Technol-ogy vol 34 no 4 pp 1141ndash1149 2016

[31] S-H Jung B-CMoon andDHan ldquoUnsupervised learning forcrowdsourced indoor localization in wireless networksrdquo IEEETransactions on Mobile Computing vol 15 no 11 pp 2892ndash2906 2016

[32] X Wang L Gao S Mao and S Pandey ldquoCSI-Based Finger-printing for Indoor Localization A Deep Learning ApproachrdquoIEEE Transactions on Vehicular Technology vol 66 no 1 pp763ndash776 2017

[33] Y Hu andG Leus ldquoRobust differential received signal strength-based localizationrdquo IEEE Transactions on Signal Processing vol65 no 12 pp 3261ndash3276 2017

[34] T-H Do and M Yoo ldquoTDOA-based indoor positioning usingvisible lightrdquo Photonic Network Communications vol 27 no 2pp 80ndash88 2014

[35] S-Y Jung S Hann and C-S Park ldquoTDOA-based opticalwireless indoor localization using LED ceiling lampsrdquo IEEETransactions on Consumer Electronics vol 57 no 4 pp 1592ndash1597 2011

[36] U Nadeem N U Hassan M A Pasha and C Yuen ldquoHighlyaccurate 3D wireless indoor positioning system using whiteLED lightsrdquo IEEE Electronics Letters vol 50 no 11 pp 828ndash8302014

[37] TQWang Y A Sekercioglu ANeild and J Armstrong ldquoPosi-tion accuracy of time-of-arrival based ranging using visiblelight with application in indoor localization systemsrdquo Journalof Lightwave Technology vol 31 no 20 pp 3302ndash3308 2013

[38] O Ennasr G Xing and X Tan ldquoDistributed time-difference-of-arrival (TDOA)-based localization of a moving targetrdquo inProceedings of the 55th IEEEConference onDecision andControlCDC 2016 pp 2652ndash2658 USA December 2016

[39] S H Yang E M Jeong and S K Han ldquoIndoor positioningbased on received optical power difference by angle of arrivalrdquoIEEE Electronics Letters vol 50 no 1 pp 49ndash51 2014

[40] A Arafa X Jin and R Klukas ldquoWireless indoor opticalpositioning with a differential photosensorrdquo IEEE PhotonicsTechnology Letters vol 24 no 12 pp 1027ndash1029 2012

[41] M T Hossan M Z Chowdhury M K Hasan et al ldquoA NewVehicle Localization SchemeBased onCombinedOptical Cam-era Communication and PhotogrammetryrdquoMobile InformationSystems vol 2018 Article ID 8501898 14 pages 2018

[42] S M Berman D S Greenhouse I L Bailey R D Clearand T W Raasch ldquoHuman electroretinogram responses tovideo displays fluorescent lighting and other high frequencysourcesrdquo Optometry and Vision Science vol 68 no 8 pp 645ndash662 1991

Wireless Communications and Mobile Computing 13

[43] H-Y Lee H-M Lin Y-L Wei H-I Wu H-M Tsai andK C-J Lin ldquoRollingLight Enabling line-of-sight light-to-camera communicationsrdquo in Proceedings of the 13th AnnualInternational Conference on Mobile Systems Applications andServices MobiSys 2015 pp 167ndash180 Italy May 2015

[44] Z Zhang ldquoA flexible new technique for camera calibrationrdquoIEEE Transactions on Pattern Analysis andMachine Intelligencevol 22 no 11 pp 1330ndash1334 2000

[45] HAMartins J R Birk andR B Kelley ldquoCameramodels basedon data from two calibration planesrdquo Computer Graphics andImage Processing vol 17 no 2 pp 173ndash180 1981

[46] E Arthurs andH Dym ldquoOn the OptimumDetection of DigitalSignals in the Presence ofWhite Gaussian NoisemdashAGeometricInterpretation and a Study of Three Basic Data TransmissionSystemsrdquo IRE Transactions on Communications Systems vol 10no 4 pp 336ndash372 1962


12 Wireless Communications and Mobile Computing

[10] R. Xu, W. Chen, Y. Xu, and S. Ji, "A new indoor positioning system architecture using GPS signals," Sensors, vol. 15, no. 5, pp. 10074–10087, 2015.

[11] Z. Zhou, M. Kavehrad, and P. Deng, "Indoor positioning algorithm using light-emitting diode visible light communications," Optical Engineering, vol. 51, no. 8, 2012.

[12] W. Zhang, M. I. S. Chowdhury, and M. Kavehrad, "Asynchronous indoor positioning system based on visible light communications," Optical Engineering, vol. 53, no. 4, Article ID 045105, 2014.

[13] S.-Y. Jung, S. Hann, S. Park, and C.-S. Park, "Optical wireless indoor positioning system using light emitting diode ceiling lights," Microwave and Optical Technology Letters, vol. 54, no. 7, pp. 1622–1626, 2012.

[14] H.-S. Kim, D.-R. Kim, S.-H. Yang, Y.-H. Son, and S.-K. Han, "An indoor visible light communication positioning system using an RF carrier allocation technique," Journal of Lightwave Technology, vol. 31, no. 1, pp. 134–144, 2013.

[15] K. Panta and J. Armstrong, "Indoor localisation using white LEDs," IEEE Electronics Letters, vol. 48, no. 4, pp. 228–230, 2012.

[16] A. D. Cheok and L. Yue, "A novel light-sensor-based information transmission system for indoor positioning and navigation," IEEE Transactions on Instrumentation and Measurement, vol. 60, no. 1, pp. 290–299, 2011.

[17] M. Nakajima and S. Haruyama, "New indoor navigation system for visually impaired people using visible light communication," EURASIP Journal on Wireless Communications and Networking, vol. 2013, article 37, 2013.

[18] H. Huang, A. Yang, L. Feng, G. Ni, and P. Guo, "Indoor positioning method based on metameric white light sources and subpixels on a color image sensor," IEEE Photonics Journal, vol. 8, no. 6, 2016.

[19] M. S. Hossen, Y. Park, and K. Kim, "Performance improvement of indoor positioning using light-emitting diodes and an image sensor for light-emitting diode communication," Optical Engineering, vol. 54, no. 3, pp. 1–11, 2015.

[20] M.-G. Moon, S.-I. Choi, J. Park, and J. Y. Kim, "Indoor positioning system using LED lights and a dual image sensor," Journal of the Optical Society of Korea, vol. 19, no. 6, pp. 586–591, 2015.

[21] Y.-S. Kuo, P. Pannuto, K.-J. Hsiao, and P. Dutta, "Luxapose: Indoor positioning with mobile phones and visible light," in Proceedings of the 20th ACM Annual International Conference on Mobile Computing and Networking (MobiCom 2014), pp. 447–458, USA, September 2014.

[22] P. Huynh and M. Yoo, "VLC-based positioning system for an indoor environment using an image sensor and an accelerometer sensor," Sensors, vol. 16, no. 6, 2016.

[23] H. Steendam, T. Q. Wang, and J. Armstrong, "Theoretical lower bound for indoor visible light positioning using received signal strength measurements and an aperture-based receiver," Journal of Lightwave Technology, vol. 35, no. 2, pp. 309–319, 2017.

[24] F. Yin, Y. Zhao, F. Gunnarsson, and F. Gustafsson, "Received-signal-strength threshold optimization using Gaussian processes," IEEE Transactions on Signal Processing, vol. 65, no. 8, pp. 2164–2177, 2017.

[25] X. Piao, Y. Zhang, T. Li et al., "RSS fingerprint based indoor localization using sparse representation with spatio-temporal constraint," Sensors, vol. 16, no. 11, 2016.

[26] C. Feng, W. S. A. Au, S. Valaee, and Z. Tan, "Received-signal-strength-based indoor positioning using compressive sensing," IEEE Transactions on Mobile Computing, vol. 11, no. 12, pp. 1983–1993, 2012.

[27] W. Gu, W. Zhang, M. Kavehrad, and L. Feng, "Three-dimensional light positioning algorithm with filtering techniques for indoor environments," Optical Engineering, vol. 53, no. 10, 2014.

[28] S.-H. Yang, E.-M. Jung, and S.-K. Han, "Indoor location estimation based on LED visible light communication using multiple optical receivers," IEEE Communications Letters, vol. 17, no. 9, pp. 1834–1837, 2013.

[29] F. I. K. Mousa, H. Le-Minh, Z. Ghassemlooy et al., "Indoor localization system utilizing two visible light emitting diodes," Optical Engineering, vol. 55, no. 11, 2016.

[30] K. Wang, A. Nirmalathas, C. Lim, K. Alameh, and E. Skafidas, "Optical wireless-based indoor localization system employing a single-channel imaging receiver," Journal of Lightwave Technology, vol. 34, no. 4, pp. 1141–1149, 2016.

[31] S.-H. Jung, B.-C. Moon, and D. Han, "Unsupervised learning for crowdsourced indoor localization in wireless networks," IEEE Transactions on Mobile Computing, vol. 15, no. 11, pp. 2892–2906, 2016.

[32] X. Wang, L. Gao, S. Mao, and S. Pandey, "CSI-based fingerprinting for indoor localization: A deep learning approach," IEEE Transactions on Vehicular Technology, vol. 66, no. 1, pp. 763–776, 2017.

[33] Y. Hu and G. Leus, "Robust differential received signal strength-based localization," IEEE Transactions on Signal Processing, vol. 65, no. 12, pp. 3261–3276, 2017.

[34] T.-H. Do and M. Yoo, "TDOA-based indoor positioning using visible light," Photonic Network Communications, vol. 27, no. 2, pp. 80–88, 2014.

[35] S.-Y. Jung, S. Hann, and C.-S. Park, "TDOA-based optical wireless indoor localization using LED ceiling lamps," IEEE Transactions on Consumer Electronics, vol. 57, no. 4, pp. 1592–1597, 2011.

[36] U. Nadeem, N. U. Hassan, M. A. Pasha, and C. Yuen, "Highly accurate 3D wireless indoor positioning system using white LED lights," IEEE Electronics Letters, vol. 50, no. 11, pp. 828–830, 2014.

[37] T. Q. Wang, Y. A. Sekercioglu, A. Neild, and J. Armstrong, "Position accuracy of time-of-arrival based ranging using visible light with application in indoor localization systems," Journal of Lightwave Technology, vol. 31, no. 20, pp. 3302–3308, 2013.

[38] O. Ennasr, G. Xing, and X. Tan, "Distributed time-difference-of-arrival (TDOA)-based localization of a moving target," in Proceedings of the 55th IEEE Conference on Decision and Control (CDC 2016), pp. 2652–2658, USA, December 2016.

[39] S. H. Yang, E. M. Jeong, and S. K. Han, "Indoor positioning based on received optical power difference by angle of arrival," IEEE Electronics Letters, vol. 50, no. 1, pp. 49–51, 2014.

[40] A. Arafa, X. Jin, and R. Klukas, "Wireless indoor optical positioning with a differential photosensor," IEEE Photonics Technology Letters, vol. 24, no. 12, pp. 1027–1029, 2012.

[41] M. T. Hossan, M. Z. Chowdhury, M. K. Hasan et al., "A new vehicle localization scheme based on combined optical camera communication and photogrammetry," Mobile Information Systems, vol. 2018, Article ID 8501898, 14 pages, 2018.

[42] S. M. Berman, D. S. Greenhouse, I. L. Bailey, R. D. Clear, and T. W. Raasch, "Human electroretinogram responses to video displays, fluorescent lighting and other high frequency sources," Optometry and Vision Science, vol. 68, no. 8, pp. 645–662, 1991.


[43] H.-Y. Lee, H.-M. Lin, Y.-L. Wei, H.-I. Wu, H.-M. Tsai, and K. C.-J. Lin, "RollingLight: Enabling line-of-sight light-to-camera communications," in Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services (MobiSys 2015), pp. 167–180, Italy, May 2015.

[44] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.

[45] H. A. Martins, J. R. Birk, and R. B. Kelley, "Camera models based on data from two calibration planes," Computer Graphics and Image Processing, vol. 17, no. 2, pp. 173–180, 1981.

[46] E. Arthurs and H. Dym, "On the optimum detection of digital signals in the presence of white Gaussian noise—A geometric interpretation and a study of three basic data transmission systems," IRE Transactions on Communications Systems, vol. 10, no. 4, pp. 336–372, 1962.

