
Original Research Article

Pupillometry via smartphone for low-resource settings

Davide Piaggio a,*, Georgy Namm a, Paolo Melillo b, Francesca Simonelli b, Ernesto Iadanza c, Leandro Pecchia a

a Applied Biomedical Signal Processing Intelligent eHealth Lab, School of Engineering, University of Warwick, Coventry CV4 7AL, United Kingdom
b Eye Clinic, Multidisciplinary Department of Medical, Surgical and Dental Sciences, Università degli Studi della Campania 'Luigi Vanvitelli', Naples, Italy
c Department of Information Engineering, University of Florence, Florence, Italy

ARTICLE INFO

Article history:
Received 8 December 2020
Received in revised form 5 May 2021
Accepted 28 May 2021
Available online xxxx

Keywords: Clinical engineering, Low-resource settings, Pupillometer, mHealth, Contextual Design, Frugal Engineering

ABSTRACT

The photopupillary reflex regulates the pupil reaction to changing light conditions. Being controlled by the autonomic nervous system, it is a proxy for brain trauma and for the condition of patients in critical care. A prompt evaluation of brain trauma can save lives. With a simple penlight, skilled clinicians can perform such an evaluation, whereas less specialized ones have to resort to a digital pupillometer. However, many low-income countries lack both specialized clinicians and digital pupillometers.

This paper presents the early results of our study aiming at designing, prototyping and validating an app for testing the photopupillary reflex via Android, following the European Medical Device Regulation and relevant standards.

After a manual validation, the prototype underwent a technical validation against a commercial infrared pupillometer. The proposed app performed as well as the manual measurements and better than the commercial solution, with lower errors, higher and significant correlations, and significantly better Bland-Altman plots for all the pupillometry-related measures.

The design of this medical device was based on our expertise in low-resource settings. These environments impose more stringent design criteria due to contextual challenges, including the lack of specialized clinicians, funds, spare parts and consumables, poor maintenance, and harsh environmental conditions, which may hinder the safe operationalization of medical devices. This paper provides an overview of how these unique contextual characteristics are cascaded into the design of an app, in order to contribute to Sustainable Development Goal 3 of the World Health Organization: Good health and well-being.

https://doi.org/10.1016/j.bbe.2021.05.012

0168-8227/© 2021 The Author(s). Published by Elsevier B.V. on behalf of Nalecz Institute of Biocybernetics and Biomedical Engineering of the Polish Academy of Sciences.

This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).

Abbreviations: IFMBE, International Federation of Medical and Biological Engineering; IR, InfraRed; LMIC, Low- and Middle-Income Country; LRS, Low-Resource Setting; MAE, Mean Absolute Error; RMSE, Root Mean Square Error; SSA, Sub-Saharan Africa; SDG, Sustainable Development Goals; UN, United Nations; WHO, World Health Organization

* Corresponding author at: Applied Biomedical Signal Processing Intelligent eHealth Lab, School of Engineering, University of Warwick, Coventry CV4 7AL, United Kingdom.

E-mail address: d.piaggio@warwick.ac.uk (D. Piaggio).



1. Background

The photopupillary reflex regulates pupil dilation and constriction according to the intensity of the light that hits the retina and is controlled by the sympathetic and parasympathetic nervous systems. Therefore, this reflex is used as an indirect measure of the central and autonomic nervous system [1]. The key medical applications of photopupillary reflex measurements include the detection of brain trauma and the assessment of its severity [2,3], the assessment of the level of anesthesia and pain [4,5], an aid to the certification of death [6], the evaluation of alcohol [7] and drug intoxication [8,9], and the study of ophthalmological diseases such as diabetic retinopathy and Horner's syndrome [1,10]. A quick evaluation of brain trauma via pupillometry, i.e., the measurement of pupil size, symmetry and reactivity, can make a real difference to the patient's health and future life, and is an essential part of the supportive care provided in this case [11,12]. The early management of traumatic brain injury, in fact, minimizes the progression of the injury and improves recovery and clinical outcomes [12,13]. Accordingly, in many high-income countries, technologies for photopupillary reflex analysis have been proposed [14-17], also using smartphones [18-21]. Recently, an app for tracking the photopupillary reflex using trained object detectors was introduced [22]. As regards pupil and iris detection algorithms, various technical solutions are available, including edge detection and the Hough transform [23], the Starburst algorithm [24], blob-detection algorithms [25], watershed segmentation [26], a gradient vector flow snake-based method [27], and deep learning [28].

However, very little has been proposed for low- and middle-income countries (LMICs), where traumatic brain injury is becoming one of the main causes of morbidity and mortality. In fact, Africa owns less than 5% of the motor vehicles in the world yet accounts for 10% of global deaths caused by vehicular injuries [29]. In LMICs, and in particular in low-resource settings (LRSs), there is a lack of expertise and diagnostics to assess brain trauma [30]. Accordingly, the United Nations (UN) aims to "halve the number of global deaths and injuries from road traffic accidents". This is target 3.6 of UN Sustainable Development Goal (SDG) number 3, Good health and well-being [31].

The photopupillary reflex can be measured with a simple penlight. Despite the simplicity of the device, accurate and reliable assessments of the photopupillary reflex require an experienced user: Couret et al. [32] demonstrated that penlight photopupillary reflex observation in neurocritical care is prone to human error, limited reproducibility and low precision. In many LMICs, diagnosis and healthcare delivery are hindered by the lack of specialized clinicians, alongside the lack of resources and a poor supply chain [33]. An alternative is the digital pupillometer, i.e., a medical device performing automated pupillometry using infrared cameras; such devices are expensive and not designed (i.e., not resilient enough) to operate in the harsh environments (i.e., dusty, warm, humid, with unstable power supply, etc.) typical of Sub-Saharan Africa (SSA).

This article presents the early results of our study aimed at designing, prototyping and validating a mobile app, based on relevant international and military standards, for testing the photopupillary reflex via Android in LRSs.

The aim of this app is to act as a screening tool that can be used by nurses (or lay users) to test the direct pupillary reflex, in order to screen incoming patients' conditions (e.g., suspected presence of brain injuries) and plan further investigations. This is crucial in LRSs.

Specifically, this paper describes the acquisition of videos, the signal processing and their technical validation. The results from eight field studies in SSA have informed the contextualized and user-driven design, and can also be relevant for informing the design of other devices for LMICs. In fact, we added design criteria addressing the challenges typical of SSA, which include the lack of specialized clinicians, the scarcity of funds, spare parts and consumables, and poor maintenance, all of which hinder the safe and efficient operationalization of medical devices. This paper demonstrates how these peculiar contextual characteristics can be cascaded into the design of a mobile app and the redesign of a medical device.

This work was inspired and informed by existing regulations and standards. In particular, those related to existing pupillometers were taken into consideration, because of their similarity to our solution. A further, more detailed analysis of standards and requirements will be needed in later stages to pass from prototype to product.

2. Methods

2.1. Ethnography-driven user-need and contextual analysis in LMICs

Designing medical devices for LRSs requires the synergy of different but complementary methodologies, comprising not only engineering, scientific and quantitative techniques, but also qualitative approaches such as ethnographic research [34]. Ethnography applied to design is, in fact, one of the keys to furthering current technological progress, by allowing designers and researchers to understand design challenges more deeply, with a focus on a particular kind of end-user and their surrounding context. For this reason, we conducted the need and context analyses with a context-driven design approach (Fig. 1). They were structured in three steps, which were iterated twice: general formalization, contextualization in SSA countries, and field studies in Benin and Uganda.

The first requirements were identified by reviewing the literature on medical devices and their related standards (i.e., ISO 14971 – Medical devices – Application of risk management to medical devices; IEC 62366 – Medical devices – Part 1: Application of usability engineering to medical devices; IEC 62304 – Medical device software – Software life cycle processes; ISO 15004 – Ophthalmic instruments – Fundamental requirements and test methods), and by performing focus groups with international experts in medical device design and management, and in hospital engineering. Five focus groups were held with world-leading experts in biomedical and clinical engineering during international conferences [35-39].

The contextualization in SSA was performed by: 1) administering surveys to African scholars; 2) holding focus groups with biomedical and clinical engineers in SSA countries (in accordance with the ethical approval REGO-2018-2283). Five focus groups were held in SSA and were attended by delegates from more than 12 SSA countries (two during the AfricaHealth conferences, two in Benin at the École Polytechnique d'Abomey-Calavi, and one in Ethiopia) [40].

Three field studies were conducted in Benin in April 2017, January 2018 and November 2019, and one in Uganda in October 2019. During these studies, several aspects of medical devices and medical locations were analyzed. This included electrical measurements, examinations of medical devices, inspections of medical locations in 6 African hospitals, and semi-structured interviews with the available staff, including biomedical engineers, technicians, nurses, doctors and hospital directors [41-43].

The focus groups were organized in collaboration with the International Federation of Medical and Biological Engineering (IFMBE) African Working Group [40].

The ethnographic analysis was conducted in accordance with the ethical approval REGO-2018-2283, obtained from the Biomedical and Scientific Research Ethics Committee.

For quality assurance, we followed the prescriptions of the European regulations on medical devices, which equate medical apps to medical devices. Moreover, we based our work on the 5As principles of the World Health Organization (WHO), i.e., affordability, availability, adequacy, accessibility, and appropriateness, in line with the solutions proposed in the WHO compendium of innovative health technologies for LRSs.

2.2. Development of the smartphone-based pupillometer

The development of the smartphone-based pupillometer followed five stages, namely the smartphone pupil stimulation and video acquisition, the preprocessing, the image processing, the system integration and the technical validation, which are described in the following subsections. These stages were developed and validated with videos acquired from 11 healthy subjects, in accordance with the ethical approval obtained from the Ethical Committee of the University of Campania Luigi Vanvitelli.¹ Further details about the dataset can be found elsewhere [14].

2.2.1. Smartphone pupil stimulation and video acquisition
During the first feasibility study, the pupil of a subject with light brown eyes was stimulated with the flash embedded in a smartphone (i.e., a ZTE Blade C341), with the illuminance set at 480 lx and the duration at 500 ms. The photopupillary reflex was captured with a second smartphone, namely a Samsung Galaxy A7 (2016) with a 13-megapixel camera, although the final app integrates both functions, so that only one smartphone, without tripod support, will be needed in future use. In fact, in the final app the video recording and the flash are synchronized as follows: the flash starts 2 s after the recording has started and lasts for 500 ms, and the recording is stopped when 9 s in total are reached. This allows a recording of about 6-7 s of the pupil reaction, in line with the typical duration of the event [44].
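As a minimal illustration of this timing scheme, written in Matlab (the environment used for the prototype processing), the frame indices covered by the flash can be derived as follows, assuming the 30-fps recording described in Section 2.2.2; the variable names are illustrative only.

% Minimal sketch of the flash/recording synchronization (illustrative only).
fps        = 30;    % frame rate of the recorded video
t_total    = 9;     % total recording duration [s]
t_flashOn  = 2;     % flash fires 2 s after the recording starts [s]
d_flash    = 0.5;   % flash duration [s]

nFrames        = t_total * fps;                                                    % 270 frames in total
baselineFrames = 1 : floor(t_flashOn * fps);                                       % pre-stimulus frames
flashFrames    = floor(t_flashOn * fps) + 1 : ceil((t_flashOn + d_flash) * fps);   % frames lit by the flash

fprintf('Flash covers frames %d-%d of %d\n', flashFrames(1), flashFrames(end), nFrames);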

Both smartphones were selected depending on their availability at the time of the experiments and were placed on a tripod at a distance of 8 cm from the subject's face. In the literature, distances in the range of 8-15 cm have been used [20,45,46]. As the luminance varies with the distance from the light source following an inverse-square law [47], the luminance at various distances (i.e., 5-20 cm, with a 1-cm step) was also evaluated, to check whether small differences in distance could significantly affect it. In the above-mentioned range, i.e., 11.5 ± 3.5 cm, the luminance had an average percentage change of 10.7%.

Fig. 1 – Context-driven design: the methodologies used for the design and evaluation of the proposed pupillometer.

¹ Registration number 500, approval title "Studio pilota sull'utilizzo della pupillometria cromatica per la diagnosi e il monitoraggio delle degenerazioni retiniche ereditarie" ("Pilot study on the use of chromatic pupillometry for the diagnosis and monitoring of inherited retinal degenerations").

Furthermore, we investigated whether the flashes at 8 cm and at 15 cm would trigger a pupil reaction, and whether the minimum reached by the pupil would be similar. This was tested with two smartphone models available at the time of writing (i.e., a Doogee S60 Lite and an iPhone 7). The pupil minimum, expressed as the normalized pupil/iris ratio, varied in the range 71.2 ± 2.7% (percentage variation of 3.8%) with the Doogee S60 Lite flash, and in the range 50.7 ± 0.4% (percentage variation of 0.8%) with the iPhone 7. This difference is probably due to the more powerful flash embedded in the second smartphone model. Therefore, assuming the test is run with the same device, the pupil contraction remains substantially the same within the recommended distance range.

The two smartphones were placed at a height such that the eye was at the center of the frame. The opposite eye was neither stimulated by the flash nor covered. Throughout these experiments, the ambient light was measured using a luxometer (Dr. Meter LX1010B), in order to ensure an approximately constant baseline light intensity.

The videos were then fed to a dedicated algorithm, as detailed below. At this prototyping stage, all the signal processing is performed in Matlab. In the future, the definitive algorithm will either run on a dedicated server application including a Matlab-compiled dynamic link library (DLL), to foster accessibility for older smartphone models, or will be embedded in the mobile device itself for the most performant models.

2.2.2. Preprocessing
The images contained in the frames of the video are preprocessed by converting them to gray-scale (see Fig. 2, IIa), binarizing them according to a threshold, and applying morphological opening (i.e., erosion followed by dilation) and closing (i.e., dilation followed by erosion) to remove unrelated dark pixels or particularly small objects (see Fig. 2, IIb). The binarization phase is a pivotal pre-step and can be influenced by the overall light intensity of the frames: if the intensity changes over the frames, the results may not be ideal. When the acquisition is performed under a high light intensity (e.g., flashlight on), a higher threshold is selected, by assessing the mean intensity of the first 3 frames (i.e., the baseline) and switching to the higher threshold whenever the mean intensity of a frame is 5% greater than the baseline. The values of the two thresholds were determined empirically. The final app records 9-s H.264-encoded videos at 30 fps, with an average size of 11.5 MB.
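A minimal Matlab sketch of this preprocessing chain is given below. The two threshold values, the structuring-element size and the mask polarity (pupil pixels as foreground) are assumptions, since the paper only states that the thresholds were determined empirically.

% Minimal sketch of the frame preprocessing (thresholds and mask polarity are assumptions).
function bw = preprocessFrame(frameRGB, baselineIntensity)
    thrLow  = 0.25;   % placeholder: empirically tuned threshold (flash off)
    thrHigh = 0.45;   % placeholder: empirically tuned threshold (flash on)

    gray = im2double(rgb2gray(frameRGB));           % gray-scale conversion (Fig. 2, IIa)

    % Switch to the higher threshold if the frame is >5% brighter than the baseline
    % (baselineIntensity = mean intensity of the first three frames).
    if mean2(gray) > 1.05 * baselineIntensity
        thr = thrHigh;
    else
        thr = thrLow;
    end

    bw = imcomplement(imbinarize(gray, thr));       % binarization; dark pixels (pupil) as foreground
    se = strel('disk', 3);                          % placeholder structuring element
    bw = imclose(imopen(bw, se), se);               % morphological opening, then closing (Fig. 2, IIb)
end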

2.2.3. Image processing
The image recognition algorithm consists of two main parts: pupil recognition and iris recognition. The reason for including the iris part stems from the contextualization of the design and is further explained in the results subsection 3.1, "Ethnography-driven user-need and contextual analysis".

2.2.3.1. Pupil recognition algorithm. The algorithm (see Fig. 2) starts by prompting a user input, i.e., the framing of the part of interest (i.e., the eye). The user draws a rectangular box superimposed on the first frame of the video, and the coordinates of this polygon are used to crop the frames used during the tuning step of the algorithm (see Fig. 2, I). The latter consists in running the commands for preparing the images (as described above) and for finding the pupil and its center, over the first three frames of the video only.

Fig. 2 – The flowchart illustrates the various steps of the preprocessing and image processing, with a specific focus on the recognition of the pupil.

As regards pupil recognition, three different approaches were tested, namely a blob-detection algorithm, the circular Hough transform, and the watershed transform. These methods were assessed by computing the mean absolute error (MAE) and the correlation with the manual measurement (Pearson's r). In particular, the blob-detection algorithm used in the comparison was a variation of the original that removed the spikes and substituted them with the average value of the points preceding and following each spike.

The blob-detection algorithm was inspired by Barragan's algorithm [25], which detects the "blob" with the greatest area contained in the picture, circles it, and finds its center. The diameter output of the tuning stage is called the baseline diameter, because it is then used to automatically calculate the cropping-frame dimensions for the main part of the algorithm, by creating a framing square centered on the pupil with a side equal to four times the baseline diameter. The same pupil recognition algorithm is then called again on all the newly cropped frames, and an array containing the unprocessed diameter is saved (see Fig. 2, III).
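A minimal sketch of the blob-based pupil detection on a preprocessed binary frame is given below, assuming the mask polarity of the preprocessing sketch above (pupil pixels as foreground); it is not the authors' exact implementation.

% Minimal sketch of blob-based pupil detection on a binary mask (pupil = foreground).
function [diamPx, center] = detectPupilBlob(bw)
    stats = regionprops(bw, 'Area', 'Centroid', 'EquivDiameter');   % candidate blobs
    if isempty(stats)
        diamPx = NaN; center = [NaN NaN];                           % nothing detected (e.g., blink)
        return
    end
    [~, idx] = max([stats.Area]);          % keep the blob with the greatest area
    diamPx   = stats(idx).EquivDiameter;   % diameter of the circle with the same area
    center   = stats(idx).Centroid;        % pupil center, used to place the cropping square
end

% Tuning stage: the baseline diameter from the first three frames sets the crop size,
% i.e., a framing square of side 4 * baselineDiameter centered on the pupil.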

2.2.3.2. Iris recognition algorithm. The algorithm (see Fig. 3) starts by prompting a user input, i.e., the framing of the part of interest (i.e., the iris). A circle is superimposed over the first frame and the user can resize it to match the iris boundary in the frame. Given the positions of three points of this circle, its equation is derived, as well as the baseline radius of the iris, which is then used as a parameter for the circular Hough transform algorithm (see Fig. 3, II). In this case, the same three algorithms tried for pupil recognition were tested as well.

Before being fed to this algorithm, the frames are preprocessed as described above (see Fig. 3, I, IIIa, IIIb). Moreover, as the outcome of the circular Hough transform depends on its sensitivity, an extra precaution is taken: the initial sensitivity is set to 0.88 but, if no circle is found during the first run, the sensitivity is increased by 0.02 until a circle is found. An array containing the unprocessed diameter is saved.
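A minimal sketch of the iris detection via the circular Hough transform with adaptive sensitivity is given below; the radius search range around the user-provided baseline radius and the dark object polarity are assumptions.

% Minimal sketch of iris detection via the circular Hough transform with adaptive sensitivity.
function irisDiamPx = detectIrisHough(grayFrame, baselineRadius)
    rRange = round([0.8 1.2] * baselineRadius);   % assumed search range around the baseline radius
    sens   = 0.88;                                % initial sensitivity, as in the text

    centers = [];
    while isempty(centers) && sens <= 1
        [centers, radii] = imfindcircles(grayFrame, rRange, ...
            'ObjectPolarity', 'dark', 'Sensitivity', sens);   % assumed polarity: iris darker than sclera
        sens = sens + 0.02;                       % relax the detector until a circle is found
    end

    if isempty(centers)
        irisDiamPx = NaN;                         % no circle found even at the maximum sensitivity
    else
        irisDiamPx = 2 * radii(1);                % strongest circle is returned first
    end
end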

2.2.3.3. Postprocessing. The acquired pupil measurements were often affected by artifacts such as blinking and image overexposure due to the flash. For this reason, a series of functions was applied in order to smooth spikes and filter out noise. In particular, the affected data points were identified and removed from the dataset, and the gaps were filled using interpolation.

As regards the flash, the above-mentioned threshold switch for the binarization was designed to tackle this problem. However, the method is not completely robust and can fail to switch to the correct threshold in the first and final frames of the flash, when it is not at full brightness. Consequently, the flash-related frames were removed and substituted with a linear interpolation. This does not affect the overall performance, as the first part of the constriction phase of the photopupillary reaction is steep and approximately linear.

As regards the blinking artifact, it partially or completely obstructs the pupil, making the algorithm track nothing or a larger area (e.g., a shadow under the eyelid). Such a sudden change of the detected area is a good indicator of when the pupil detection fails. Detrending and differentiation were used to identify these sudden changes: the data points affected by blinking are flagged when the local derivative exceeds a threshold in magnitude. Also in this case, the flagged points are removed and linear interpolation is used to fill the gaps.

Finally, the ratio between the diameter of the pupil and that of the iris is calculated and normalized to its initial value. The values related to the frames preceding the pupil reaction are identified and set to 100%. The part of the array related to the pupil reaction is fitted with a Gamma function, as suggested by Knapen et al. [48] (see Fig. 2, IV).
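A minimal sketch of the artifact removal and normalization is given below; the blink-detection threshold and the exact ordering of the steps are assumptions, and the Gamma fit itself is omitted.

% Minimal sketch of the artifact removal and normalization (thresholds are assumptions).
function ratioNorm = postprocessRatio(pupilDiam, irisDiam, flashFrames, fps)
    pupilDiam = pupilDiam(:).';  irisDiam = irisDiam(:).';   % work with row vectors
    ratio = pupilDiam ./ irisDiam;                  % pupil/iris ratio, robust to eye-to-phone distance
    t     = (0:numel(ratio)-1) / fps;               % time axis [s]

    % 1) Flag flash-affected frames (overexposure) and blink-affected frames.
    d   = diff(detrend(ratio));                     % detrend + differentiate to expose sudden jumps
    bad = [false, abs(d) > 0.1];                    % placeholder magnitude threshold for blinks
    bad(flashFrames) = true;

    % 2) Fill the gaps with linear interpolation.
    good      = ~bad & ~isnan(ratio);
    ratioFill = interp1(t(good), ratio(good), t, 'linear', 'extrap');

    % 3) Normalize to the pre-stimulus (baseline) value, expressed in percent.
    baseline  = mean(ratioFill(1:flashFrames(1)-1));
    ratioNorm = 100 * ratioFill / baseline;
    % The reaction portion is subsequently fitted with a Gamma-shaped function
    % (as in Knapen et al. [48]); that fit is omitted from this sketch.
end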


The algorithm also calculates several variables relevant to pupillometry (a minimal computational sketch is given after the list):

• Pupil minimum: the minimum size reached by the pupil at the end of the constriction phase; the constriction phase was considered to start when the pupil/iris normalized ratio fell below 98% of its original value;
• Latency: the delay in the pupil response, calculated as the time between the start of the flash and the start of the constriction phase;
• Max constriction velocity: the maximum rate of change of the pupil diameter during the constriction phase;
• Mean constriction velocity: the average rate of change of the pupil diameter during the constriction phase;
• Mean dilation velocity: the average rate of change of the pupil during the dilation phase, which is contiguous to the constriction phase and was considered to end when the pupil/iris normalized ratio rose back above 98% of its original value;
• T75: the time taken by the pupil to recover 75% of the amplitude of the constriction, starting from the peak of the constriction.
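The sketch below shows how these variables can be derived from the normalized pupil/iris ratio, under the assumptions stated in the comments; it is not the authors' exact implementation.

% Minimal sketch of the pupillometry variables (ratioNorm in %, t in s, flash onset tFlash in s).
function v = pupilVariables(ratioNorm, t, tFlash)
    inReaction = ratioNorm < 98;                    % below 98% of the baseline value
    iStart = find(inReaction, 1, 'first');          % start of the constriction phase
    iEnd   = find(inReaction, 1, 'last');           % end of the dilation phase (back above 98%)
    [vMin, iMin] = min(ratioNorm);                  % peak of the constriction

    dRatio = diff(ratioNorm) ./ diff(t);            % rate of change of the ratio [%/s]

    v.pupilMinimum        = vMin;
    v.latency             = t(iStart) - tFlash;
    v.maxConstrictionVel  = max(abs(dRatio(iStart:iMin-1)));    % constriction: from onset to the minimum
    v.meanConstrictionVel = mean(abs(dRatio(iStart:iMin-1)));
    v.meanDilationVel     = mean(abs(dRatio(iMin:iEnd-1)));     % dilation: from the minimum to recovery

    % T75: time to recover 75% of the constriction amplitude, from the constriction peak
    % (the baseline is assumed to be 100%).
    target = vMin + 0.75 * (100 - vMin);
    i75    = iMin - 1 + find(ratioNorm(iMin:end) >= target, 1, 'first');
    v.t75  = t(i75) - t(iMin);
end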

2.2.4. System integration
The resulting pupillometry system has been designed as a three-tier application:

• Presentation layer: the Android app will be used for acquiring the video samples and firing the flash.
• Logic layer: a connection with the server code performing the analyses will be developed; the app will act as a client. In the case of high-performance devices, both the client and the server software will run on the mobile device.
• Data layer: our system will be linked to the database and web application described in [15-17] through RESTful dialog and dedicated APIs.

2.2.5. App design
The app was developed in Android Studio, using Java for the implementation of the functions and XML for the design of the user interface, targeting Android-based smartphones with an API level of at least 21 (i.e., Android 5.0 Lollipop), because of the use of the "camera2" package. This choice allows 94.1% of Android users to use our app, as only 5.9% of Android-based smartphones worldwide have an API level lower than 21 (similar trends can be found in Africa), according to the Android platform/API version distribution reported in Android Studio [49].

The logo, representing an eye-shaped logarithmic spiral, was hand-drawn and digitized using GIMP (GNU Image Manipulation Program).

2.3. Technical validation

2.3.1. Video acquisition procedure and image processing
All the frames of one of the acquired videos were analyzed both with the Matlab algorithm and with manual measurements. In particular, the frames were analyzed manually by two independent authors who were blinded to the output of the pupillometer, in order to reduce the risk of bias: for each frame, the diameters of the pupil and of the iris were measured twice and averaged, in order to reduce the measurement error.

Also in this case, the values related to the frames preceding the flash were identified and set to 100%. Pearson's r and the associated p-value, the root mean square error (RMSE) and the MAE were then calculated for the Gamma-fitted signal and for the raw automated signal, compared against the manual measurements. Moreover, the error rate was estimated by calculating the percent error and counting how frequently it exceeded a 10% threshold.
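A minimal sketch of these agreement metrics, computed with base Matlab functions, is given below; the variable names are illustrative.

% Minimal sketch of the validation metrics against the manual measurements.
function m = validationMetrics(autoSignal, manualSignal)
    e      = autoSignal - manualSignal;            % per-frame error (both signals in % of baseline)
    m.rmse = sqrt(mean(e.^2));                     % root mean square error
    m.mae  = mean(abs(e));                         % mean absolute error

    [R, P]   = corrcoef(autoSignal, manualSignal); % Pearson's r and the associated p-value
    m.r      = R(1, 2);
    m.pValue = P(1, 2);

    pctError    = 100 * abs(e) ./ manualSignal;    % percent error per frame
    m.errorRate = 100 * mean(pctError > 10);       % share of frames exceeding the 10% threshold
end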

2.3.2. Benchmarking
Our pupil-tracking algorithm was also validated against the output of an IR pupillometer (NeurOptics DP-2000). The Gamma fit was not needed in this case, because the flash does not interact with the IR recording. The technical validation was based on the output variables specified above. In particular, the variables output by the IR pupillometer were normalized with respect to the initially measured pupil size, in order to make them comparable with those resulting from our algorithm.

The RMSE and MAE were calculated for each variable and for both algorithms, comparing them with the manual measurements taken by two independent and blinded authors. Then, 3 Bland-Altman plots were generated for each of the 4 variables, after testing whether their residuals were normally distributed with a Shapiro-Wilk test [50] (normality being a necessary condition for such plots). The Bland-Altman plots compared the app algorithm and the IR algorithm, the app algorithm and the manual measurements, and the IR algorithm and the manual measurements.
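A minimal sketch of the Bland-Altman agreement analysis between two of the methods is given below; the Shapiro-Wilk normality check is only indicated as a comment, since it is not part of base Matlab.

% Minimal sketch of a Bland-Altman agreement analysis between two methods
% (e.g., the app algorithm and the manual measurements, for one pupillometry variable).
function [bias, loa] = blandAltman(methodA, methodB)
    d    = methodA - methodB;            % per-subject differences
    m    = (methodA + methodB) / 2;      % per-subject means
    bias = mean(d);                      % systematic bias between the methods
    loa  = 1.96 * std(d);                % half-width of the 95% limits of agreement
    % Normality of d should be verified beforehand (the paper uses a Shapiro-Wilk test).

    figure; plot(m, d, 'o'); hold on;
    yline(bias, '-');                    % bias line
    yline(bias + loa, '--');             % upper limit of agreement
    yline(bias - loa, '--');             % lower limit of agreement
    xlabel('Mean of the two methods'); ylabel('Difference between the methods');
end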

2.3.3. Testing the safety of the flash
An experiment was set up to test the safety of use of a smartphone flash on the human eye, although the safety of the procedure had already been confirmed by preliminary research [45]. The smartphone was placed on a tripod and an operator held the sensor of a luxometer (Dr. Meter LX1010B) in front of the camera at a distance of interest for pupillometry, i.e., 8.5 cm. Firstly, the illuminance in this condition (i.e., ambient light) was recorded; secondly, the flash was turned on and the illuminance in this condition was recorded. Hence, the illuminance of the flash alone could be easily calculated and compared against the limits set by the relevant ISO standards (see Section 3.3.3).

3. Results

3.1. Ethnography-driven user-need and contextual analysis in LMICs

The contextual analysis highlighted that SSA countries have [41,43]: extremely limited resources; an insufficient number of healthcare professionals and of specialized doctors; inadequate hospital infrastructure; a highly unstable mains power supply; poor transport infrastructure and supply chains; and an uneven distribution of resources, which are concentrated in the capital to the detriment of remote areas. Nonetheless, SSA can count on a very young population, a wide diffusion of mobile phones and smartphones [52], and ICT literacy. There is a wide diffusion of one dominant smartphone operating system (i.e., 86.39% of smartphones are based on Android) [53] and good coverage of wireless telecommunications. Prospectively, the SSA market for medical devices is growing fast (the compound annual growth rate is around 6%) [54]. The adoption of new technologies meets limited inertia and healthcare operators are resilient. In fact, working in challenging conditions pushes workers to cope with unpredictable conditions and events, developing a great capability to react to, respond to and recover from emergencies. Nonetheless, this positive attitude comes with evident risks too. Often, non-specialized personnel respond to medical device malfunctioning with creative shortcuts, which tend to become chronic solutions, prone to new risks, hindering the recovery of the initial level of effectiveness and safety [55]. Finally, a massive "brain drain" affects doctors and specialized doctors, who move to other countries for better opportunities, further depleting SSA healthcare systems [56].

The results of the contextual analysis were discussed with African scholars and healthcare personnel in Benin, Ethiopia and South Africa, resulting in a series of specifications for the local manufacturing of a resilient pupillometer, with its consumables and spare parts. The design should be low-cost and based on free design and manufacturing processes; it should empower non-specialized healthcare personnel and provide clear guidance or affordances; it should preferably be battery-based and resilient to an unstable power supply; it should be resilient to misuse, require no maintenance and be easy to clean; and it should be based on Android smartphones, ideally compatible with the high degrees of ingress protection (e.g., IP68) described in IEC 60529 and with rugged and military standards (e.g., MIL-STD-810G).

None of the pupillometers reviewed was sufficiently resilient for LMICs. Existing smartphone solutions meet the cost requirement but, as emerged from our study, this is not the only criterion for being resilient in LMICs. For example, most of the proposed solutions widely rely on accessories and spare parts, including external LEDs, filters, and lenses, which would limit the lifetime of the device in SSA. In fact, such parts would be difficult to retrieve, repair or replace in LRSs [41,43].

Moreover, when deepening the design principles of a pupillometer, two technical requirements emerged: a computational load compatible with an old Android smartphone; and the use of no accessories, or only accessories that could be locally manufactured (e.g., 3D printed).

This last criterion particularly influenced the design of the app. The majority of existing pupillometers use visible light to stimulate the pupil and infrared (IR) cameras to film its constriction, in order to avoid artifacts. Most smartphones do not contain IR cameras, therefore visible light was used both to stimulate the pupil, using the phone flash, and to film its reaction with the phone camera. As a consequence, the video frames coinciding with the flash were overexposed, due to the sudden change of luminosity and the proximity of the subject, requiring the adoption of a fitting algorithm to recover the missing pupil diameter in those frames. Moreover, phone camera frame rates are lower than those of many pupillometers. Thus, the proposed algorithm fitted the acquired diameter data first with a linear fit, in order to recover missing data due to the flash, and then with a Gamma distribution to approximate missing frames and reconstruct the complete response of the pupil, as proposed in [48]. The interpolation also reduced the blinking artifacts, which affect standard pupillometry as well. Moreover, the distance between the eye and the device created artifacts in the estimation of the pupil diameter. These artifacts could be limited with a recycled-plastic 3D-printed accessory clipped onto the mobile phone, aiming at keeping the eye-to-phone distance constant. However, since a 3D printer may not be available, the proposed algorithm for the recognition of the pupil reflex was based on the ratio between the diameter of the pupil and that of the iris. In fact, while the pupil diameter reacts to light, the iris does not. The ratio was normalized with the value measured before the flash to facilitate the reading of the pupil diameter. The adoption of these features required a specific technical validation of the final algorithm and app.

3.2. Development of the smartphone-based pupillometer

A total of 4 videos were recorded, in which the eye was stimulated 3 times, in order to be sure of capturing a good-quality response (i.e., an absence or a reduced number of blinks).

3.2.1. Preprocessing and image processing
Three methods were tested for pupil and iris detection: a blob-detection algorithm, the circular Hough transform, and the watershed transform. The blob-detection algorithm outperformed the other methods, with a lower MAE (3.9% versus 4.55% for the Hough transform and 21.25% for the watershed transform) and a higher correlation (Pearson's r of 0.95 with p-value < 0.00001, versus 0.84 with p-value < 0.00001 for the Hough transform, and 0.03 with p-value of 0.83 for the watershed transform), and was therefore selected for the pupil tracking (Fig. 2, III). This choice also avoided the introduction of an extra user input, i.e., the pupil radius range, which is necessary for the Hough transform to work. The Hough transform outperformed the other methods in tracking the iris; consequently, the Hough transform was used for the iris tracking (see Fig. 3, IV). Fig. 4 shows the comparison of three signals, namely the Gamma-fitted ratio, the raw automated signal and the manual measurements; the removal of the flash and blink artifacts via post-processing is evident in Fig. 4.

3.2.1.1. App design. The app for tracking the photopupillary reflex, named Oida (meaning "I have seen" and "I know", from the Ancient Greek verb "ὁράω"), is being finalized. As of now, the app comprises a Main Activity, an Instructions Activity and a Camera Activity. It is available in two languages, English and French, both widespread in SSA.

3.3. Technical validation

3.3.1. Video acquisition and image processing
During the manual validation (see Fig. 4), the Gamma-fitted ratio was significantly and highly correlated with the manual measurement (Pearson's r = 0.963, p-value < 0.0001, versus Pearson's r = 0.982 and p-value < 0.0001 for the raw automated signal from the app), with an RMSE and a MAE of 3.20% and 2.24%, respectively (versus 3.96% and 3.09% for the raw signal). Moreover, the error rate for the Gamma-fitted ratio was 7.14%.

3.3.2. Benchmarking
Ten videos acquired by clinical ophthalmologists with the IR pupillometer on healthy subjects were analyzed both with the IR-pupillometer software and with the app algorithms in Matlab. Fig. 5 shows the pupil reaction over the frames, as captured by the three different algorithms. The resulting measures were compared with those calculated by hand after manually annotating the diameter of the pupil for each video frame. The MAE and RMSE demonstrated a significant improvement in comparison with the software provided with the commercial device, for all the variables, as reported in Table 1. The agreement among the measurement methods, namely app algorithms/IR method, app algorithms/manual method and IR method/manual method, was estimated with Bland-Altman plots [57,58] (the plots are not reported for brevity, but are available upon request), following the Shapiro-Wilk test for normality [50].

All the differences entered into the Bland-Altman plots were normally distributed, with 0.842 < W < 1 [59] at a 95% confidence level. Table 2 reports the 95% limits of agreement for each variable (the lower, the better). The agreement between the app and the manual method outperformed the other method pairs.

3.3.3. Testing the safety of the flash
The base illuminance (i.e., that of the ambient light) was measured at 200 lx; since the illuminance in the "flash on" state was 680 lx, the illuminance of the flash alone was 480 lx. The comparison of this value with the above-mentioned ISO standards ensured the safety of the procedure. In fact, 480 lx converts to 7.03·10⁻⁵ W/cm² under the hypothesis of an average wavelength of 555 nm (the ISO standards set the maximum allowed value to 0.706 W/cm²).
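This conversion follows from the luminous efficacy of monochromatic 555-nm light (683 lm/W), as in the minimal sketch below.

% Minimal sketch of the illuminance-to-irradiance conversion at 555 nm.
lux_flash   = 480;                 % measured illuminance of the flash alone [lx = lm/m^2]
K_555       = 683;                 % luminous efficacy of 555-nm light [lm/W]

E_W_per_m2  = lux_flash / K_555;   % irradiance [W/m^2], assuming all the power at 555 nm
E_W_per_cm2 = E_W_per_m2 / 1e4;    % convert from m^2 to cm^2

fprintf('Irradiance: %.2e W/cm^2 (ISO limit: 0.706 W/cm^2)\n', E_W_per_cm2);
% Prints about 7.03e-05 W/cm^2, matching the value reported above.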

4. Discussion

This paper presented the design and technical validation of an app for the measurement of the pupillary reflex, intended to be used in LRSs. Given the absence of specific regulations or clear guidelines for the design of medical devices for LRSs, we adopted the prescriptions of the European regulations on medical devices, the relevant standards for designing smartphone applications, and the 5As principle of the WHO.

Fig. 4 – A comparison of the signals: the manual measurements (dashed gray line), the raw automated algorithm (dashed orange line) and the Gamma-fitted ratio (yellow line).

Fig. 5 – The normalized pupil diameter over the frames. Superimposed on the curve, 5 sample snapshots acquired during the app testing. Each snapshot comprises two halves, highlighted by a yellow semi-circle: each left half represents the initial condition, and each right half represents the evolution of the response.

Table 1 – Values of the mean absolute error (MAE) and root mean square error (RMSE) for the IR pupillometer and for our solution (app).

Parameter                          Method   MAE    RMSE
Pupil minimum (%)                  IR       1.11   1.55
                                   app      1.00   1.36
Max constriction velocity (%/s)    IR       6.76   8.07
                                   app      2.56   3.26
Mean constriction velocity (%/s)   IR       2.85   3.70
                                   app      0.47   0.70
Mean dilation velocity (%/s)       IR       6.84   7.35

The first part of this paper illustrated how local needs and contextual analyses can be performed by enriching engineering design with ethnographic methods. The second part presented and discussed the technical validation of the software, which was performed in two steps: validation of the acquisition, and benchmarking of our app against a commercial IR pupillometer, assuming as gold standard the frame-by-frame manual annotation of pupillary video recordings from 10 subjects. The very low errors and high correlation resulting from the former validation confirmed that a smartphone-based pupillometry acquisition without accessories is viable. This concept was corroborated by the low errors and narrow limits of agreement for the variables resulting from the second validation. The latter proved that the proposed solution, despite being based on a simple app and a smartphone in order to be sustainable in resource-scarce settings, performs just as well as, and often better than, the benchmark. These results were possible thanks to the interpolation algorithm and the normalization of the pupil diameter by the iris diameter, which minimized the artifacts due to hand motion and to the use of visible light, via the mobile phone flash, for both pupil stimulation and video acquisition. In fact, commercial pupillometers use IR for image acquisition, which is not available in the majority of smartphones. Indeed, the described app achieved better results than the commercial IR medical device.

Moreover, the comparison with the existing literature suggests that the proposed solution is the only one designed for LMICs and rigorously validated. In 2013, Tae-hoon Kim et al. [18] proposed a smartphone-based pupillometer that works with an Android app and an add-on device containing two types of LEDs and an IR filter. Their results showed that their system could have been a good candidate for pupillometry; however, it was not validated against a CE-marked or FDA-cleared commercial pupillometer. Moreover, the required accessories would make its use in LMICs inconvenient. In 2017, Mariakakis et al. [19] proposed an iPhone-based pupillometer that works with a box similar to the one used for virtual-reality headsets and makes use of convolutional neural networks. The box was used to eliminate ambient light and to control the distance to the person's face. Nonetheless, the authors themselves claimed that such a box could be a hindrance when measuring the pupil light reaction of an unconscious patient and when tracking the whole reaction to the flash (i.e., the dilation phase cannot be captured because of the lack of lighting). Their design, in fact, only allowed assessing the pupil constriction phase and seems to require a server connection in order to work. In 2018, McAnany et al. [20] performed a study proving that the iPhone camera could be used for this purpose, comparing it with an IR camera that was not medically rated. In 2019, the start-up Brightlamp introduced an iPhone app for tracking the photopupillary reflex, based on trained object detectors and requiring no accessory. This app was manually validated, similarly to part two of our validation, with no benchmarking, resulting in a higher MAE (2.9%) and wider limits of agreement for the pupil constriction (±14%, which improved to ±9% after bias correction). However, a recent study by McKay et al. [60] benchmarked Brightlamp against a portable IR pupillometer, demonstrating that this particular iPhone app has poor repeatability and is not a practical tool for supporting clinical decisions. Nonetheless, in general, iPhone-based pupillometry relying on the Hough transform was proved to be possible and accurate enough by Neice et al. [61].

Moreover, the use of iPhones in SSA is quite uncommon, due to their cost and because the iPhone does not have any rugged model. In 2019, Vigário et al. [21] proposed a system for the continuous monitoring of the pupil using a smartphone, the Virtoba support for mobile phones, and two LEDs. However, the system does not provide the typical pupillometry stimulus (i.e., a flash in the eye) and its validation was limited to physiological data found in the literature (i.e., the reaction of the pupils to a cold stress test). None of the designs above was conceived for LRSs. As emerged from our contextual analysis, in fact, basing part of the design on extra add-on parts can turn out to be counterproductive, either in a possible early health technology assessment phase or once the device is on the market. Adding extra add-on devices increases the need for spare parts, which will probably not be available in LRSs. For this reason, the authors of this paper suggest that a "less is more" philosophy should be adopted when considering additional parts for a device conceived for these settings. Although the study was focused on pupillometry, its findings on design can be relevant for other applications. For instance, it emerged clearly that affordability is not the only criterion for a device to be suitable for LRSs. Many other issues should be considered during the design, including affordance, ease of deployment and use, resilience to underlying infrastructures that may not be stable, availability of spare parts and consumables, and the available underlying technologies.

Table 2 – Values of agreement (95% limits of agreement) between methods for each of the four variables.

Variable                      Methods    Limits of agreement
Pupil minimum                 app-IR     ±2.74%
                              app-Man    ±1.99%
                              IR-Man     ±3.03%
Max constriction velocity     app-IR     ±9.48%/s
                              app-Man    ±6.33%/s
                              IR-Man     ±11.74%/s
Mean constriction velocity    app-IR     ±7.42%/s
                              app-Man    ±1.23%/s
                              IR-Man     ±7.52%/s
Mean dilation velocity        app-IR     ±5.61%/s
                              app-Man    ±0.20%/s

Another issue that emerged is the tendency to release apps with healthcare ambitions without proper technical validation (e.g., manual validation and/or benchmarking). In past years, both the FDA and the European Commission equated medical software (including apps) to medical devices, making validation essential to guarantee safety and adequate performance. For this reason, in this paper we decided to adopt the European perspective for CE marking medical apps, in order to stress the importance of the technical validation phase in the design cycle.

5. Limitations

This study presents the preliminary results of the design and technical validation of a smartphone app.

The results are valid for, and limited to, one Android smartphone model; further testing could include more models. In these further tests, the flashes of the different smartphone models should be checked for safety against the relevant standards.

The current design relies on a server connection, which may be a bottleneck, although many remote areas of LRSs (e.g., in Africa) are currently served by good-quality mobile phone services. To overcome this limitation, future versions of the app will also include the processing algorithms on the device; in this regard, artificial intelligence may also be explored. While this solution may be difficult to run on very old smartphones, it should run smoothly on other models.

Furthermore, a possible bias might have been introduced in the feasibility study because the opposite eye was not covered and could potentially have been partially stimulated by changes in the ambient light. However, the ambient light was measured and kept as constant as possible throughout the experiment. In this regard, healthcare workers will need to be instructed to cover the opposite eye, in order to avoid bias in the pupillary reactions.

Moreover, as of now, the app does not report any result in millimetres; future versions may include this feature for the pupil size only, as it would be redundant for the pupil/iris ratio.

Finally, the performance of the app has so far been evaluated on light brown eyes; darker shades should be investigated, as they may be more challenging for pupillometers relying on visible light only. Future experiments could test the application on subjects with three types of iris colour (i.e., fair, medium, and dark). In this way, the efficiency of our application on different iris colours could be evaluated. This could also inform future upgrades of the app software to make it more efficient.

6. Conclusions

This paper presented the design and technical validation of a mobile app aimed at performing smartphone-based pupillometry, suitable for use in LMICs. The performance of the app algorithm is promising and, being able to compete with the performance of the algorithm of a commercial IR pupillometer medical device, suggests furthering the study with more smartphone models and transitioning towards a dedicated server application and/or a completely standalone app.

The performance of the app algorithms, as confirmed by the technical validation, is sound: the proposed solution, by exploiting the pervasive presence of smartphones in LMICs and by not requiring expensive setups or complex procedures, represents a significant step towards extensive screening of eye pathologies and brain trauma worldwide.

Declarations

Ethics approval and consent to participate
The ethnographic analysis was conducted in accordance with the ethical approval REGO-2018-2283, obtained from the Biomedical and Scientific Research Ethics Committee.

The videos used for the technical validation were acquired from 11 healthy subjects in accordance with the ethical approval obtained from the Ethical Committee of the University of Campania Luigi Vanvitelli (Registration number 500, approval title "Studio pilota sull'utilizzo della pupillometria cromatica per la diagnosi e il monitoraggio delle degenerazioni retiniche ereditarie").

Consent for publication
Not applicable.

Availability of data and materials
All the data and materials are available upon reasonable request.

Competing interests
The authors declare no competing interests.

Funding
D. Piaggio and L. Pecchia received funding from the University of Warwick with two Warwick Impact Fund grants supported by the EPSRC Impact Accelerator Award (EP/K503848/1 and EP/

CRediT authorship contribution statement

Davide Piaggio: Methodology, Formal analysis, Investigation, Visualization. Georgy Namm: Formal analysis, Investigation, Data curation, Visualization. Paolo Melillo: Methodology, Investigation, Resources. Francesca Simonelli: Methodology, Resources. Ernesto Iadanza: Conceptualization, Methodology, Resources, Supervision. Leandro Pecchia: Conceptualization, Methodology, Investigation, Resources, Visualization, Supervision, Project administration, Funding acquisition.

Acknowledgement

None.

REFERENCES

848 [1]Hall JE. Guyton and Hall textbook of medical physiology e-849 Book. Elsevier Health Sciences; 2015.

850 [2]Adoni A, McNett M. The pupillary response in traumatic 851 brain injury: a guide for trauma nurses. J Trauma Nursing 852 2007;14:191–6.

853 [3] Chen JW, Gombart ZJ, Rogers S, Gardiner SK, Cecil S, Bullock

854 RM. Pupillary reactivity as an early indicator of increased

855 intracranial pressure: The introduction of the Neurological

856 Pupil index. Surg Neurol Int. 2011;2.

857 [4]Wildemeersch D, Peeters N, Saldien V, Vercauteren M, Hans 858 G. Pain assessment by pupil dilation reflex in response to 859 noxious stimulation in anaesthetized adults. Acta 860 Anaesthesiol Scand 2018;62:1050–6.

861 [5]Wildemeersch D, Baeten M, Peeters N, Saldien V, Vercauteren 862 M, Hans G. Pupillary dilation reflex and pupillary pain index 863 evaluation during general anaesthesia: a pilot study. 864 Romanian J Anaesthesia Intensive Care 2018;25:19–23.

865 [6]Bryant CD. Handbook of death and dying. Sage; 2003.

866 [7]Hall CA, Chilcott RP. Eyeing up the future of the pupillary light 867 reflex in neurodiagnostics. Diagnostics 2018;8:19.

868 [8]Sharpe LG, Pickworth WB, Martin WR. Actions of 869 amphetamine and antagonists on pupil diameter in the 870 chronic sympathectomized dog. Psychopharmacology 871 1977;53:115–20.

872 [9]Rollins MD, Feiner JR, Lee JM, Shah S, Larson M. Pupillary 873 effects of high-dose opioid quantified with infrared 874 pupillometry. Anesthesiology 2014;121:1037–44.

875 [10]Park JC, Chen Y-F, Blair NP, Chau FY, Lim JI, Leiderman YI, 876 et al. Pupillary responses in non-proliferative diabetic 877 retinopathy. Sci Rep 2017;7:44987.

878 [11]Liew B, Zainab K, Cecilia A, Zarina Y, Clement T. Early 879 management of head injury in adults in primary care. 880 Malaysian Family Physician 2017;12:22.

881 [12]Moppett IK. Traumatic brain injury: assessment, 882 resuscitation and early management. Br J Anaesth 883 2007;99:18–31.

884 [13] Health NIf, Excellence C. Head injury: assessment and early

885 management. London: NICE Guideline (CG176). 2014.

886 [14] Melillo P, de Benedictis A, Villani E, Ferraro MC, Iadanza E,

887 Gherardelli M, et al. Toward a novel medical device based on

888 chromatic pupillometry for screening and monitoring of

889 inherited ocular disease: A pilot study. World Congress on

890 Medical Physics and Biomedical Engineering 2018: Springer;

891 2019. p. 387-90.

892 [15]Iadanza E, Fabbri R, Luschi A, Gavazzi F, Melillo P, Simonelli F, 893 et al. ORA´O: RESTful cloud-based ophthalmologic medical

894 record for chromatic pupillometry. In: International

895 Conference on Medical and Biological Engineering. Springer;

896 2019. p. 713–20.

897

[16] Iadanza E, Fabbri R, Luschi A, Melillo P, Simonelli F. A

898 collaborative RESTful cloud-based tool for management of

899 chromatic pupillometry in a clinical trial. Health Technol

900 2020;10:25–38.

901

[17] Iadanza E, Goretti F, Sorelli M, Melillo P, Pecchia L, Simonelli F, 902 et al. Automatic detection of genetic diseases in pediatric age

903 using pupillometry. IEEE Access 2020;8:34949–61.

904

[18] Kim T-H, Youn J-IJJotOSoK. Development of a

Smartphone-905

based Pupillometer. 2013;17:249-54.

906

[19] Mariakakis A, Baudin J, Whitmire E, Mehta V, Banks MA, Law

907

A, et al. Pupilscreen: Using smartphones to assess traumatic

908

brain injury. 2017;1:81.

909

[20] McAnany JJ, Smith BM, Garland A, Kagen SLJO, Science V.

910

iPhone-based pupillometry: a novel approach for assessing

911

the pupillary light reflex. 2018;95:953.

912

[21] Viga´rio R, Santos M, Vences R, Quinta˜o C. Towards a mobile 913 phone pupillometer. In: International Conference on Applied

914 Human Factors and Ergonomics. Springer; 2019. p. 194–206.

915

[22] Sluss K, Rao R, Schuler J. Evaluation of Reflex and its

916

Applications in Medicine.

917

[23] Melillo P, Pecchia L, Testa F, Rossi S, Bennett J, Simonelli F.

918 Pupillometric analysis for assessment of gene therapy in

919 Leber Congenital Amaurosis patients. Biomed Eng Online

920 2012;11:40.

[24] Li D, Winfield D, Parkhurst DJ. Starburst: a hybrid algorithm for video-based eye tracking combining feature-based and model-based approaches. In: 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05) - Workshops. IEEE; 2005. p. 79.

[25] Barragan D. Tracking pupil using image processing. Available at: https://it.mathworks.com/matlabcentral/fileexchange/49599-tracking-pupil-using-image-processing Last access: 22/02/2021.

[26] Ferone A, Frucci M, Petrosino A, di Baja GS. Iris detection through watershed segmentation. In: International Workshop on Biometric Authentication. Springer; 2014. p. 57–65.

[27] Jarjes AA, Wang K, Mohammed GJ. GVF snake-based method for accurate pupil contour detection. Inf Technol J 2010;9:1653–8.

[28] Minaee S, Abdolrashidi A. DeepIris: iris recognition using a deep learning approach. arXiv preprint arXiv:1907.09380; 2019.

[29] Saidi H, Mutiso BK, Ogengo J. Mortality after road traffic crashes in a system with limited trauma data capability. J Trauma Manag Outcomes 2014;8:4.

[30] WHO. The top 10 causes of death. Available at: https://www.who.int/news-room/fact-sheets/detail/the-top-10-causes-of-death Last access: 22/02/2021.

[31] UN. Transforming our world: the 2030 Agenda for Sustainable Development. Available at: https://sustainabledevelopment.un.org/post2015/transformingourworld Last access: 22/02/2021.

[32] Couret D, Boumaza D, Grisotto C, Triglia T, Pellegrini L, Ocquidant P, et al. Reliability of standard pupillometry practice in neurocritical care: an observational, double-blinded study. Crit Care 2016;20. https://doi.org/10.1186/s13054-016-1239-z.

[33] Shah MT, Joshipura M, Singleton J, LaBarre P, Desai H, Sharma E, et al. Assessment of the availability of technology for trauma care in India. World J Surg 2015;39:363–72.

[34] Fetterman DM. Ethnography (Vol. 17). Thousand Oaks, CA; 1998.

[35] Jaffray DA. World Congress on Medical Physics and Biomedical Engineering, June 7-12, 2015, Toronto, Canada.


[36] Pecchia L. Assessment of Medical Devices in low income settings. Third WHO Global Forum on Medical Devices. Geneva; 2017.

[37] Eskola H, Väisänen O, Viik J, Hyttinen J. EMBEC & NBC 2017: Joint Conference of the European Medical and Biological Engineering Conference (EMBEC) and the Nordic-Baltic Conference on Biomedical Engineering and Medical Physics (NBC), Tampere, Finland, June 2017. Springer; 2017.

[38] Lhotska L, Sukupova L, Lacković I, Ibbott GS. World Congress on Medical Physics and Biomedical Engineering 2018: June 3-8, 2018, Prague, Czech Republic. Springer; 2018.

[39] WHO. Fourth WHO Global Forum on Medical Devices. Geneva; 2019.

[40] Pecchia L, Pallikarakis N, Magjarevic R, Iadanza E. Health Technology Assessment and Biomedical Engineering: global trends, gaps and opportunities. Med Eng Phys 2019;72:19–26.

[41] Piaggio D, Medenou D, Houessouvo RC, Pecchia L. Donation of medical devices in low-income countries: preliminary results from field studies. In: International Conference on Medical and Biological Engineering. Springer; 2019. p. 423–7.

[42] Piaggio D, Namm G, Pecchia L. Vol. 2 (2019), Special Issue 2: 3rd ICEHTMC Proceedings. Global Clinical Engineering Journal. Rome: GlobalCE; 2019.

[43] Di Pietro L, Piaggio D, Oronti I, Maccaro A, Houessouvo RC, Medenou D, et al. A framework for assessing healthcare facilities in low-resource settings: field studies in Benin and Uganda. J Med Biol Eng 2020;40:526–34.

[44] Ciuffreda KJ, Joshi NR, Truong JQ. Understanding the effects of mild traumatic brain injury on the pupillary light reflex. Concussion 2017;2:CNC36. https://doi.org/10.2217/cnc-2016-0029.

[45] Calandra DM, Di Martino S, Riccio D, Visconti A. Smartphone based pupillometry: an empirical evaluation of accuracy and safety. In: International Conference on Image Analysis and Processing. Springer; 2017. p. 433–43.

[46] Shin YD, Bae JH, Kwon EJ, Kim HT, Lee TS, Choi YJ. Assessment of pupillary light reflex using a smartphone application. Exp Ther Med 2016;12:720–4.

[47] Voudoukis N, Oikonomidis S. Inverse square law for light and radiation: a unifying educational approach. Eur J Eng Technol Res 2017;2:23. https://doi.org/10.24018/ejers.2017.2.11.517.

[48] Knapen T, de Gee JW, Brascamp J, Nuiten S, Hoppenbrouwers S, Theeuwes J. Cognitive and ocular factors jointly determine pupil responses under equiluminance. PLoS One 2016;11:e0155574.

[49] Statcounter. Mobile & Tablet Android Version Market Share Worldwide. Available at: https://gs.statcounter.com/android-version-market-share/mobile-tablet/worldwide Last access: 22/02/2021.

[50] Shapiro SS, Wilk MB. An analysis of variance test for normality (complete samples). Biometrika 1965;52:591–611.

[51] ISO. International standard ISO 15004-2, Ophthalmic instruments - Fundamental requirements and test methods, Part 2: Light hazard protection. 2007.

[52] De Maria C, Di Pietro L, Ravizza A, Lantada AD, Ahluwalia AD. Open-source medical devices: healthcare solutions for low-, middle-, and high-resource settings. In: Clinical Engineering Handbook. Elsevier; 2020. p. 7–14.

[53] Statista. Market share of mobile operating systems in Africa from January 2018 to July 2020. Available at: https://www.statista.com/statistics/1045247/share-of-mobile-operating-systems-in-africa-by-month/ Last access: 22/02/2021.

[54] Statista. Size of the medical device market in Africa in 2017 and a forecast for 2023. Available at: https://www.statista.com/statistics/890876/medical-device-market-size-africa/ Last access: 22/02/2021.

[55] Borsci S, Kuljis J, Barnett J, Pecchia L. Beyond the user preferences: aligning the prototype design to the users' expectations. Hum Factors Ergon Manuf Serv Ind 2016;26:16–39.

[56] Firsing S. How severe is Africa's brain drain? Africa at LSE. 2016.

[57] Bland JM, Altman DG. Statistical methods for assessing agreement between two methods of clinical measurement. Lancet 1986;327:307–10.

[58] Giavarina D. Understanding Bland Altman analysis. Biochem Med 2015;25:141–51.

[59] USEPA. Monitoring Guidance for Determining the Effectiveness of Nonpoint Source Controls. 1997.

[60] McKay RE, Kohn MA, Schwartz ES, Larson MD. Evaluation of two portable pupillometers to assess clinical utility. Concussion 2020;5:CNC82.

[61] Neice AE, Fowler C, Jaffe RA, Brock-Utne JG. Feasibility study of a smartphone pupillometer and evaluation of its accuracy. J Clin Monit Comput 2020;1–9.
