MoLE grew out of the need to provide training and education where long-standing low bandwidth, limited internet connectivity and weak infrastructure make it difficult to train and communicate. Even though the participating organizations, JKO and the Telemedicine and Advanced Technology Research Center (TATRC), viewed the project from different perspectives, they agreed that research was needed to show that m-Learning could address the connectivity challenge under unpredictable conditions (i.e., full connectivity, low connectivity and no connectivity).
Venkat Sastry on the MoLE Project: A Global Technology Initiative
1. Dr. Venkat V S S Sastry, Jacob Hodges
ONRG Grant N62909-10-1-7121, 13 November 2014
21 countries
Azerbaijan
Bulgaria
Canada
Chile
France
Germany
Georgia
Jordan
Mexico
Norway
Poland
Peru
Romania
Serbia
South Africa
Singapore
Switzerland
UAE
UK
Ukraine
US
• The MoLE Project was a two-year initiative sponsored by the Coalition Warfare Program involving 24 countries
• Based on a requirement by the Commander, US Naval Forces Europe/Naval Forces Africa/Sixth Fleet, Deputy Director for Training and Readiness, to operate effectively in the largest maritime area of operations, where the most difficult challenge is the ability to train and communicate
3. MoLE would leverage the global cellular network infrastructure, mobile technologies, and emerging mobile application/service models to build an m-Learning capability
8. Technology & Transition WG
Blending “task” and “learning”
Initially, content grouped by moment of need
9. Mission Tools
A collection of interactive job aids and performance support tools
Learning
The Learning section focuses on informal and formal training experiences. (Placeholder for future content)
Library
The Library contains a multimedia collection of indexed materials in different media (text, eBook, video) covering topics relevant to the organization.
Mission Packs
Mission packs allow download of new content
Standards
The Standards section contains critical references
Network
Crowd-sourcing information on local resources and key contacts.
10. “Compelling m-learning is not just about the content. Or the technology. It is also about good learning design, good usability, and an app that you want to use.
Most work-based m-learning happens in stolen moments, which is why a smooth blend between learning, performance-support and productivity-tools is so key.”
11. Converting and originating new content for MSO mobile learning
12. Task
Locate the pre-deployment checklist and check off at least 3 items for your personal preparations. Add one new item that you might need, and save.
13. Usability review – October 2011
Proof of Concept trials – April to June 2012, using a “baked in” evaluation layer
14. All content needed significant “mobilization”
Blend of tools + courses worked well
Benefits of strong UX / user interface
Cross-platform = reuse (content still in use)
Rich media / video very effective
Similar responses regardless of age, device type, nationality, background
15. Testing & Evaluation Group
Data collection plan
• Demographic data
• Formulate research questions
Compliance with Research Ethics
Data analysis and interpretation
16. The Testing & Evaluation process adhered to the research protocol guidelines established by:
1. US Department of Defense Directive (DODD) 5400.11 (DoD Privacy Program)
2. DODD 3216.2 (Protection of Human Subjects in DoD-Supported Research), SECNAVINST 3900.39D, 32 CFR 219 (Common Rule)
3. European Union (EU) Data Protection Requirements
4. UK Ministry of Defence JSP 536 (Research Ethics Committee)
17. All investigators were required to complete Research Ethics courses online
18. Testing & Evaluation Group
(a) What is the effectiveness or practicality of using mobile technologies to provide training?
[Bar chart: All Users responses, ratings 0–7]
19. (b) What is the benefit or availability of using mobile devices in providing training?
Testing & Evaluation Group
20. (c) Do persons who have used the mobile device-provided training believe they are capable of performing a desired outcome?
Testing & Evaluation Group
21. (d) What is the degree to which a mobile training application is available in austere environments to as many role players as possible?
Testing & Evaluation Group
22. EVALUATION QUESTIONS | Category
1.1 Rate your confidence in using the checklist. | Self-efficacy
2.1 How useful are mobile devices for this type of training? | Utility
2.2 How easy was the CTIP content to navigate? | Accessibility
3.1 How useful is access to digital reference materials on a mobile device? | Utility
4.1 How easily could you explain to a colleague how to look up NGO logos? | Self-efficacy
5.1 How useful are videos as a tool to enhance your understanding of USAID? | Usefulness
6.1 How helpful are collaboratively updated contact details? | Self-efficacy
7.1 How useful are mobile devices for training? | Utility
7.2 How useful are mobile devices for refresher training? | Utility
7.3 How useful are mobile videos to enhance understanding? | Utility
7.4 How useful are digital reference materials on mobile devices? | Utility
7.5 How useful are collaborative contact pages to access real-time information? | Utility
7.6 How easy was the Global MedAid app to navigate? | Accessibility
7.7 Please write up to five single words which best describe your overall experience of using the mobile device as a tool for learning.
7.8 What did you like most about the mobile device as a tool for learning?
7.9 What did you like least about the mobile device as a tool for learning?
8 Any additional comments
28. Confidence rating scale:
0 – no response
1 – not confident at all
2 – not confident
3 – not very confident
4 – neither confident nor unconfident
5 – quite confident
6 – confident
7 – very confident
29.
• A “Welcome Email” was sent to approximately 600 volunteers on the first day of the MoLE Proof of Concept
• The announcement email provided additional information on installation, the user guide and an introduction video
30.
• Launched: 252
• Started Proof of Concept: 166 (65.9%)
• Completed Proof of Concept: 126 (50%)
31. Question | Responses
Age | less than 20, 20-29, 30-39, 40-49, 50+, no answer
Gender | Male, Female
How proficient are you in English? | Beginner . . . Advanced
Are you using your own personal smartphone for the purpose of this trial? | Yes, No, No Answer
How comfortable are you with using the mobile device that's running the MoLE app? | Beginner . . . Advanced
Have you previously been involved in humanitarian assistance or disaster relief operations? | Yes, No, No Answer
What is your professional expertise? | Medical, Rescue, Training, E-learning, Other
Have you taken the Combating Trafficking in Persons (CTIP) course within the last two years? | Yes, No, No Answer
32. Age Group Breakdown
< 20 years: 5 (1.87%)
20+ years: 51 (19.03%)
30+ years: 83 (30.97%)
40+ years: 75 (27.99%)
50+ years: 52 (19.40%)
No Response: 2 (0.75%)
42. [Bar charts: Android, iPhone and All Users response distributions]
Likert Rating Question All-Users Android iPhone
N/A or Skipped 8 3 5
1 Not useful at all 9 1 8
2 Not useful 9 4 5
3 Not very useful 20 14 6
4 Neither useful nor not useful 18 6 12
5 Quite useful 33 15 18
6 Useful 60 23 37
7 Very useful 22 5 17
All Users: n = 179; minimum 0; maximum 7; mean 4.7 ± 0.3; median 5.4; mode 6; standard deviation 1.9; variance 3.5
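The summary statistics reported for these response distributions can be reproduced from the frequency counts in the table. A minimal sketch in Python, under two assumptions not stated in the slides: "N/A or Skipped" responses are scored as 0 (consistent with the reported minimum of 0), and the non-integer medians (e.g. 5.4) come from a grouped, interpolated median.

```python
import math

# All-user frequency counts for one Likert item (from the table above).
# Assumption: "N/A or Skipped" responses are scored as 0.
counts = {0: 8, 1: 9, 2: 9, 3: 20, 4: 18, 5: 33, 6: 60, 7: 22}

n = sum(counts.values())                                  # 179 respondents
mean = sum(k * c for k, c in counts.items()) / n          # rounds to 4.7
var = sum(c * (k - mean) ** 2 for k, c in counts.items()) / n
sd = math.sqrt(var)                                       # rounds to 1.9

def grouped_median(counts, n):
    """Interpolated median, treating rating k as the interval [k-0.5, k+0.5)."""
    cum = 0
    for k in sorted(counts):
        if cum + counts[k] >= n / 2:
            return (k - 0.5) + (n / 2 - cum) / counts[k]
        cum += counts[k]

print(n, round(mean, 1), round(sd, 1), round(grouped_median(counts, n), 1))
```

The grouped median evaluates to about 5.3 here, close to the reported 5.4; small differences are expected since the exact rounding and median conventions used in the deck are unknown.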
43. [Bar charts: Android, iPhone and All Users response distributions]
Likert Rating Question All-Users Android iPhone
N/A or Skipped 18 5 13
1 Not useful at all 2 0 2
2 Not useful 4 2 2
3 Not very useful 6 2 4
4 Neither useful nor not useful 9 2 7
5 Quite useful 42 17 25
6 Useful 47 26 21
7 Very useful 51 17 34
All Users: n = 179; minimum 0; maximum 7; mean 5.1 ± 0.3; median 6.0; mode 7; standard deviation 2.1; variance 4.5
44. A majority of the Android participants were “confident” and iPhone users were “very confident” in using the checklist to support pre-deployment activities.
45. [Bar charts: Android, iPhone and All Users response distributions]
Likert Rating Question All-Users Android iPhone
N/A or Skipped 25 9 16
1 Not useful at all 1 0 1
2 Not useful 2 0 2
3 Not very useful 3 0 3
4 Neither useful nor not useful 9 2 7
5 Quite useful 35 19 16
6 Useful 45 17 28
7 Very useful 59 24 35
All Users: n = 179; minimum 1; maximum 7; mean 5.1 ± 0.3; median 6.0; mode 7; standard deviation 2.3; variance 5.4
46. [Bar charts: Android, iPhone and All Users response distributions]
Likert Rating Question All-Users Android iPhone
N/A or Skipped 24 9 15
1 Not useful at all 0 0 0
2 Not useful 5 1 4
3 Not very useful 9 1 8
4 Neither useful nor not useful 17 5 12
5 Quite useful 27 15 12
6 Useful 54 30 24
7 Very useful 43 10 33
All Users: n = 179; minimum 2; maximum 7; mean 4.8 ± 0.3; median 6.0; mode 6; standard deviation 2.3; variance 5.2
47. [Bar charts: Android, iPhone and All Users response distributions]
Likert Rating Question All-Users Android iPhone
N/A or Skipped 30 10 20
1 Not useful at all 1 1 0
2 Not useful 5 0 5
3 Not very useful 16 6 10
4 Neither useful nor not useful 14 4 10
5 Quite useful 29 13 16
6 Useful 46 27 19
7 Very useful 38 10 28
All Users: n = 179; minimum 1; maximum 7; mean 4.5 ± 0.4; median 5.0; mode 6; standard deviation 2.4; variance 5.8
48. [Bar charts: Android, iPhone and All Users response distributions]
Likert Rating Question All-Users Android iPhone
N/A or Skipped 43 20 23
1 Not useful at all 0 0 0
2 Not useful 0 0 0
3 Not very useful 7 1 6
4 Neither useful nor not useful 11 5 6
5 Quite useful 22 15 7
6 Useful 54 21 33
7 Very useful 42 9 33
All Users: n = 179; minimum 3; maximum 7; mean 4.4 ± 0.4; median 6.0; mode 6; standard deviation 2.7; variance 7.2
49. [Bar charts: Android, iPhone and All Users response distributions]
Likert Rating Question All-Users Android iPhone
N/A or Skipped 14 4 10
1 Not useful at all 2 1 1
2 Not useful 11 4 7
3 Not very useful 27 16 11
4 Neither useful nor not useful 23 6 17
5 Quite useful 26 14 12
6 Useful 38 20 18
7 Very useful 38 6 32
All Users: n = 179; minimum 0; maximum 7; mean 4.4 ± 0.3; median 5.0; mode 6 and 7; standard deviation 2.0; variance 4.2
50. [Bar charts: Android, iPhone and All Users response distributions]
Likert Rating Question All-Users Android iPhone
N/A or Skipped 5 1 4
1 Not useful at all 5 2 3
2 Not useful 22 8 14
3 Not very useful 6 3 3
4 Neither useful nor not useful 18 3 15
5 Quite useful 37 16 21
6 Useful 58 29 29
7 Very useful 28 9 19
All Users: n = 179; minimum 0; maximum 7; mean 4.8 ± 0.3; median 5.3; mode 6; standard deviation 1.8; variance 3.4
‘They can because they think they can’
51. [Bar charts: Android, iPhone and All Users response distributions]
Likert Rating Question All-Users Android iPhone
N/A or Skipped 33 12 21
1 Not useful at all 0 0 0
2 Not useful 2 0 2
3 Not very useful 7 3 4
4 Neither useful nor not useful 11 3 8
5 Quite useful 28 15 13
6 Useful 45 17 28
7 Very useful 53 21 32
All Users: n = 179; minimum 2; maximum 7; mean 4.7 ± 0.4; median 6.0; mode 7; standard deviation 2.5; variance 6.3
56. A mobile application framework that consisted of (a) mobile applications (for the iOS and Android mobile operating systems) for the storage and display of mobile learning content in a range of open formats; and (b) an application programming interface (API) describing how interactive content (produced in HTML) may interact with the application.
57. A web-based content management system for the storage and management of learning content, plus the associated standards for 'packaging' and meta-data describing learning content.
58. A 'mobile evaluation layer' for surveying user feedback through the mobile applications during the period of research by (a) presenting users with contextually relevant questions during use of the mobile applications; and
59. (b) collecting and collating evaluation data in a central repository, including techniques for associating multiple evaluation responses from a single user and allowing evaluations to be undertaken 'offline' (without cellular or Wi-Fi internet access) for asynchronous transfer to the repository.
60. All of the Mobile Learning Environment (MoLE) source code for the iOS and Android apps and the online content repository has been uploaded to the Open Source Mobile Learning Environment (OMLET) and is available, free of charge, at https://wss.apan.org/1539/JKO/mole/SitePages/Home.aspx
61. Mobile devices are practical and effective
Screen size is very important
Productivity tools enhance the learning experience
Mobile training applications are appropriate in challenging environments
62. Book
“The Mobile Learning Environment (MoLE) Project: A Global Technology Initiative” is about a two-year training and education project, sponsored by the U.S. Department of Defense (DoD)’s Coalition Warfare Program (CWP) and other research-related organizations, that focused on providing training/education in areas with low bandwidth and limited internet connectivity and infrastructure.
This book is available in paperback and e-reader versions via Amazon. The paperback is also available via CreateSpace (https://www.createspace.com/4174484)
It will be of interest to:
• Learning and Development Professionals
• Educators and Teachers
• E-Learning and m-Learning Professionals
• Evaluators
• Government and Non-Governmental Organizations (NGO)
• Information Technology (IT) Professionals
• Medical Professionals
• Quality Management Professionals
• Research, Development, Testing & Evaluation (RDT&E) Professionals
• Science and Technology (S&T) Professionals
A blog has been created (http://wp.me/p2ZlwE-4o) for anyone interested in reading the Preface.
A brief overview of the MoLE Project is available via Slideshare (http://www.slideshare.net/jrhodges1972/tracking/new#)
64. Learning is strongly influenced by motivational factors and context.
Measurement of learning is hotly debated; the extent to which technology-enhanced instruction facilitates learning adds to this debate.
65. Plan – 80 minutes
Play video to begin with (3 mins)
Background to project; sponsors
Early designs (from Tribal slides)
Example content – 15 mins
• Medical
• MATTs
• Evaluation
Challenges – 10 mins
• Technological
• Cultural; expectations; linguistic
• Ethical compliance; IRBs
Data Collection Plan – 15 mins
• Informed consent
• Questionnaire
Data Analysis – 15 mins
• Demographic data
• Exploratory data analysis
• Analysis of text; word co-occurrence
Final Report
• Major outputs
• Include Jake’s booklet on Amazon
Concluding Remarks
• Offer live demo
• Quiz, NGO etc.
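The "word co-occurrence" step of the text-analysis plan can be illustrated with a short sketch: count how often pairs of words appear together in the same free-text response (for example, answers to questions 7.7–7.9). The sample responses below are invented for illustration.

```python
from collections import Counter
from itertools import combinations

# Invented sample free-text responses (placeholders, not trial data).
responses = [
    "useful portable easy",
    "portable slow screen",
    "useful portable screen",
]

pairs = Counter()
for text in responses:
    # Deduplicate and sort words so each unordered pair is counted once
    # per response, with a canonical (alphabetical) key.
    words = sorted(set(text.lower().split()))
    pairs.update(combinations(words, 2))

print(pairs.most_common(2))
```

In a real analysis the responses would come from the evaluation repository, with stop-word removal and stemming applied before pairing; the counting logic stays the same.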
66.
• A “Welcome Reminder Email” was sent bi-weekly to remind the volunteers
• The reminder email also provided additional information on installation, the user guide and an introduction video
78. As a learning platform
As a collaborative platform
As a problem solving tool
As a data acquisition device
As a computational resource (e.g. run simulations)
As a communication device
79. An evaluation strategy must be cognizant of the desired learning outcomes for the intended learner, and should take into account the combination of instructional strategies, including delivery media.
80. Learning effectiveness is a function of effective pedagogical practices.
Smartphone technologies facilitate constructive engagement with the learner, which in turn influences pedagogic practice.
81. The internal and cognitive information-processing events include temporary storage, encoding and retrieval.
The external instructional events include gaining attention, drill and practice, and feedback.
82. Individuals who proactively approach their learning tasks
• Personal initiative, perseverance, adaptive skills
Self-regulated processes help learners acquire knowledge and skills more effectively.
Self-regulation of learning has been suggested to refer to self-directed processes that help learners learn more effectively.
83. Item | Instrument; Remarks
Background knowledge, prior skills, abilities | Conduct a pre-test
Self-regulation in learning context (Toering, Elferink-Gemser, Jonker, Van Heuvelen, & Visscher, 2012) | Use of tools that support self-regulation; collect the Self-regulation of Learning Self-Report Scale prior to and post learning intervention. The online form is available at https://docs.google.com/spreadsheet/viewform?formkey=dHFwNnBiRjY1QWM0eE5ielA5X0pVdXc6MQ
Learning content | Analysis of learning content provided for the learner
Learning support tools | Use of auxiliary support tools online, on the device; usage by the learner; measurement of learner engagement; learner analytics
Learning transfer | Conduct post-tests; apply knowledge to specific tasks that map to learning objectives; these tests include quizzes, case studies, written assignments and peer review
Learner feedback | Qualitative feedback and textual analysis
Learning style/preferences | Learning modality preference inventory and Kiersey Temperament Inventory. It is anticipated that learning style has second-order effects on learning (see also Neuhauser, 2002). Learning style may have a stronger influence on learner engagement and attention span.
Time on task | Record of tasks completed and the use of tools. This data may be collected on the device (aka MoLE).
Instructional method | Types used and the nature of the tasks
Teacher effects | Controlled experiment
Editor's Notes
Duke Boutewell
Dr. Cynthia Barrigan and her team from DMRTI have contributed extensively to some of the top-notch content.
Some of the video clips are not for the faint of heart.
Developing content, repurposing content, issues with content – the potential range of users, the lack of collaboration from partners
Visual Analogue Scale
After you have entered your pin you will be taken to the home screen of the MSO app.
We are proposing redesigning this screen to be more like a “dashboard”, and adding a new button to launch the evaluation
See Mgt_meet_Tribal_Eval_layer.pptx
Usability review with RAF Halton – what we discovered from talking to users
Proof of Concept research
Online demographic questionnaire
User ID protected
Evaluation within the app – survey of different parts of the app, user guided through and invited to respond (no response option always available)
32 CFR 219 – Title 32, Code of Federal Regulations
(b) Unless otherwise required by department or agency heads, research activities in which the only
involvement of human subjects will be in one or more of the following categories are exempt from this
policy:
(1) Research conducted in established or commonly accepted educational settings, involving normal
educational practices, such as (i) research on regular and special education instructional strategies, or (ii)
research on the effectiveness of or the comparison among instructional techniques, curricula, or
classroom management methods.
(2) Research involving the use of educational tests (cognitive, diagnostic, aptitude, achievement), survey
procedures, interview procedures or observation of public behavior, unless:
(i) Information obtained is recorded in such a manner that human subjects can be identified, directly or
through identifiers linked to the subjects; and
(ii) Any disclosure of the human subjects' responses outside the research could reasonably place the
subjects at risk of criminal or civil liability or be damaging to the subjects' financial standing,
employability, or reputation.
17 questions
Due to storage limitations, videos are accessed online from a server.
Scenario 1 involved demonstrating the utility and self-efficacy of using m-learning approaches to support traveling to a disaster site. Participants were directed to the “Mission Tools” category and asked to review the “Packing Checklist”. Each participant was expected to select three items from the packing list and add an additional item they thought was missing. After completing the Packing Checklist, participants were asked to take a pre-deployment training course in order to evaluate the use of mobile technologies as a training medium. For this training task, participants were asked to take the Combatting Trafficking in Persons (CTIP) pre-test, post-test and Module 1.