WHEN VIRTUAL REALITY MEETS IOT IN THE GYM: ENABLING IMMERSIVE AND INTERACTIVE MACHINE EXERCISE

By

Md Fazlay Rabbi

A THESIS

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Electrical Engineering - Master of Science

2016

ABSTRACT

WHEN VIRTUAL REALITY MEETS IOT IN THE GYM: ENABLING IMMERSIVE AND INTERACTIVE MACHINE EXERCISE

By

Md Fazlay Rabbi

The advent of head-mounted displays (HMDs) such as Oculus Rift, Samsung Gear VR, and Microsoft HoloLens is turning immersive virtual reality (VR) into reality. As one of its most compelling applications, we envision that VR will revolutionize the personal fitness experience in our daily lives. Towards this vision, we present JARVIS, a novel immersive exercise tracking application enabled by miniature IoT sensing devices combined with a mobile HMD device. By attaching IoT sensing devices to any gym machine, JARVIS continuously tracks exercise progress and assesses exercise quality in real time, enabling effective interaction with machines. By converting the captured exercise progress and quality information to VR inputs, it creates an immersive exercise experience with a virtual avatar to guide machine exercises. We have conducted extensive experiments to validate the performance of JARVIS. It achieves 97.96% repetition segmentation accuracy without knowing the current exercise type, as well as 99.08% exercise type recognition accuracy in 1.22 repetitions on average. We have also conducted a real-world deployment study to examine the efficacy of the proposed platform. By analyzing muscle activities using high-fidelity clinical surface electromyography (sEMG) sensors, our results indicate that our VR/IoT-based platform could provide engaging and effective guidance to exercisers for their strength training.

Copyright by
MD FAZLAY RABBI
2016

To my mom (Fatema Khatun) and dad (Ansar Ali Mia) for their unconditional love and support throughout my education.

ACKNOWLEDGMENTS

I would like to take this opportunity to express my sincere appreciation to my advisor, Dr. Mi Zhang, for his continuous support, help, patience and encouragement throughout my master study. I would also like to thank Dr. Taiwoo Park from the Department of Media and Information for his help, support and motivation throughout my thesis work. This thesis would not have been possible without the guidance of Dr. Mi Zhang and Dr. Taiwoo Park. I am deeply grateful to Dr. Youngki Lee from Singapore Management University for his motivation and insightful comments about this work. My sincere gratitude also goes to my committee member, Dr. Daniel Morris. Dr. Morris was very generous with his time, ideas and assistance. Special thanks go to my colleagues in the Ubiquitous and Mobile Sensing Laboratory who have supported me in my research work through reviews, discussions and meetings. I am thankful to my friends (in no particular order) Sayeed Hyder, Shams Reaz, Saif Muhammad Imran, Sheikh Mohammad Shavik, Proteek Chandan Roy, Anik and Kamran Ali for their support and for making my life wonderful here in East Lansing. Last but not least, I am deeply grateful to my family for their tremendous support and encouragement.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
Chapter 1 Introduction
Chapter 2 Background
2.1 Motivation and Design Objectives
2.1.1 Motivating Scenario
2.1.2 Design Objectives
2.1.3 Targeted Machine Exercises
Chapter 3 Related Work
Chapter 4 JARVIS System Overview
4.1 VR Synthesis Engine
4.1.1 Automatic Scene Management
4.1.1.1 Muscle Highlighting
4.1.1.2 Smart Visual Quality Guide
4.1.2 Real-time Virtual Body Animation
4.1.3 Informative Head-Up Display (HUD)
4.2 Real-time Exercise Analytics Engine
4.2.1 Data Acquisition and Preprocessing
4.2.2 Exercise Progress Tracking
4.2.2.1 Repetition Segmentation and Counting
4.2.2.2 Motion Progress Detection
4.2.3 Exercise Type Recognition
4.2.3.1 Sensor Location
4.2.3.2 Feature Selection
4.2.3.3 Session-wise Voting
4.2.4 Exercise Quality Assessment
4.2.4.1 Local Feature Extraction
4.2.4.2 Motion Trajectory Comparison
Chapter 5 System Evaluation
5.1 Experimental Setup
5.2 Performance of Repetition Segmentation and Counting
5.2.1 Evaluation Metrics
5.2.2 Evaluation Results of Repetition Segmentation
5.2.3 Evaluation Results of Repetition Counting
5.3 Performance of Exercise Recognition
5.3.1 Impact of Sensor Location and Voting
5.3.2 Performance of Subject Independent Model
5.3.3 Impact of Number of Features
5.4 Validity of Quality Assessment
5.5 System Performance
5.5.1 Processing Time
5.5.2 Energy Consumption
5.5.2.1 VR HMD Energy Consumption
5.5.2.2 SensorTag Energy Consumption
5.5.3 Device Temperature and Visual Quality
Chapter 6 Case Study with JARVIS
6.1 Study Methods
6.1.1 Participants and Instrumentation
6.1.2 Study Design and Analysis
6.2 Results
6.2.1 EMG activity
6.2.2 User experience: Questionnaire survey
6.2.3 User experience: Interview analysis
6.2.3.1 Virtual Self and its Movement
6.2.3.2 Effectiveness of Real-time HUD
6.2.3.3 Effectiveness of Muscle Highlighter
6.2.3.4 Points of Improvements and Design Suggestions
Chapter 7 Conclusion and Future Works
BIBLIOGRAPHY

LIST OF TABLES

Table 4.1: List of features for machine exercise recognition
Table 5.1: Performance of repetition segmentation
Table 5.2: Overall precision, recall, accuracy and F-measure for different sensor locations and voting options
Table 5.3: Confusion matrix of exercise type recognition
Table 5.4: Processing time statistics
Table 5.5: Energy consumption statistics

LIST OF FIGURES

Figure 1.1: Example usage of JARVIS
Figure 2.1: The 12 target machine exercises
Figure 4.1: The system architecture of JARVIS
Figure 4.2: Example of three different quality options
Figure 4.3: The miniature SensorTag device and a customized 3D printed case with an embedded magnet
Figure 4.4: The illustration of the principle of the repetition segmentation algorithm
Figure 4.5: Motion trajectories of one repetition of Leg Extension and of Biceps Curl in the feature spaces of AI and VI
Figure 5.1: The illustration of data collection setup and sensor tag placement
Figure 5.2: Performance of subject independent model
Figure 5.3: Accuracy changes over number of features
Figure 5.4: Validity of exercise quality assessment
Figure 5.5: Energy consumption measurement setting with Samsung Galaxy S7, Gear VR and Monsoon Power Monitor
Figure 5.6: CPU temperature changes over time
Figure 5.7: Ambient temperature changes over time
Figure 6.1: Illustration of sEMG sensor placement
Figure 6.2: Without VR vs. With VR in terms of the root-mean-square (RMS) values of surface EMG signals
Figure 6.3: Mean normalized electromyography values for four muscle groups
Figure 6.4: IMI survey results for three categories

Chapter 1 Introduction

The promise of immersive virtual reality (VR) is starting to look very real with the emergence of head-mounted displays (HMDs) such as Oculus Rift [5], Samsung Gear VR [6], and Microsoft HoloLens [3]. Immersive VR soaks a user in a computer-generated simulated environment that changes naturally with the movements of the user. An engaged immersive virtual experience is thus realized by employing sensing technologies that capture the user's movements and use that information to update the sensory stimuli presented to the user via an HMD, creating an illusion of being immersed in a virtual environment with which they can interact [27]. Given its unique capability of enabling such an engaged immersive virtual experience, immersive VR has been regarded as a technology that has the significant potential to revolutionize a wide range of industries such as entertainment, education, and healthcare.

In this thesis, we propose an innovative immersive and interactive exercise assistant, named JARVIS, enabled by the synergistic adoption of two emerging technologies, i.e., head-mounted VR devices and miniature IoT sensors. Today, working out in the gym has become an important part of people's modern lifestyle [28]. However, working out on the stationary exercise machines in the gym quickly makes exercisers feel bored [15, 18]. Moreover, novice exercisers are hardly aware of whether they are using the right set of muscles, whether their exercise speed is adequate, etc., without the help of professional trainers. This prevents exercisers from making steady progress, and eventually makes exercisers lose their interest and motivation.

Figure 1.1: Example usage: (a) a user trying JARVIS, (b) stereoscopic immersive VR screen of virtual avatar.

JARVIS is the first exercise assistant system to enable an immersive and interactive gym exercise experience. In a nutshell, JARVIS senses the exercise types and movements of an exerciser using a miniature portable sensor device, assesses exercise quality and progress in real time, and provides rich, immersive feedback through a VR headset. For example, it shows a user a virtual avatar scene with her virtual body, indicating the proper way to exercise, along with rich real-time exercise progress and quality information. In this way, JARVIS aims at serving a virtual avatar to guide machine exercises in a highly interactive manner and also creates a truly immersive gym exercise experience.

A number of unique technical and usability challenges arise to fully realize the vision of JARVIS.
First of all, it needs to sense the type of exercises and the movements of exercisers in real time. Since most gyms are not yet instrumented with sensing capability, we aimed at achieving this with a single tiny portable sensor that the exerciser carries. Second, JARVIS needs to meet very tight performance requirements in terms of latency and accuracy. Low latency and high accuracy are very important to enable usable and highly immersive experiences. Third, we need to create smooth VR scenes without making the head-mounted VR device heat up too much. Exercisers easily feel hot while lifting on the machines, and even moderate heat caused by the VR device could easily hinder a pleasant exercise experience. Finally, we need to design a usable, immersive, and effective VR avatar user interface.

JARVIS addresses these challenges in several ways. First, it employs the concept of machine-wearables, that is, temporarily attaching a tiny IoT sensor tag on exercise machines, thereby obtaining clean sensor signals without pre-installed infrastructure. Second, we find a combination of tag locations showing the best exercise type recognition accuracy, and utilize the repetitive nature of exercise to further increase the accuracy. Third, we propose a smart guide for UI developers to balance visual quality and computation load. Lastly, we design an immersive and effective virtually-reconstructed user interface by fully utilizing the potential of the VR HMD device.

We have shown the effectiveness of JARVIS through extensive experiments on its technical components as well as studies on its usability. Our results show that JARVIS could quickly and accurately segment repetitions (97.96%) and detect the exercise type (99.08% in 1.22 repetitions on average) with a tiny IoT device. Its smart guide for UI developers also helps in preventing the VR HMD device from overheating. Our case study conducted over 10 exercisers shows the effectiveness of the virtual avatar enabled by the system. JARVIS showed significant improvements in enjoyment, perceived competence, and usefulness, compared to a typical exercise setting, and the participants generally liked the VR user interface components. In addition, we quantitatively analyzed the muscle activities of exercisers using high-fidelity clinical surface electromyography (sEMG) sensors, and our results indicate that virtual muscle highlighting provides a significant increase (p < .01, p < .05) in muscle activation, compared to typical exercise training, implying the potential of effective guidance for strength training.

Chapter 2 Background

2.1 Motivation and Design Objectives

2.1.1 Motivating Scenario

David has been thinking of building his muscles using stationary workout machines in the gym for the past couple of years. However, he failed to adhere to regular strength training schedules. He once worked with personal trainers, but they could not be with him all the time. Most of the existing mobile logging apps required him to manually log exercise records. Although there are a few automatic monitoring apps, they could only summarize the exercises after the exercise session, which was not very helpful for him during the training.

David begins to try the JARVIS exercise tracking platform. He brings a mobile VR HMD and a small sensor tag to any gym. He wears the HMD and puts the sensor tag on a biceps curl machine. After a few trial repetitions, JARVIS detects the type of the machine and quickly displays a virtual trainer scene designed for the biceps curl. JARVIS also provides an immersive and interactive VR environment showing a "virtual David" following his exercise movements, and provides real-time exercise progress and quality information. Moreover, JARVIS highlights the required muscle groups corresponding to the current exercise being performed and his personal goal of strength training, which helps him easily give more focus to those muscle groups. Once David finishes the biceps curl, JARVIS turns off the virtual exercise scene and switches to the outside pass-through camera, to let David navigate to the next machine. With JARVIS, David can go to the gym with automatically-generated training plans and an automated real-time immersive exercise tracking application, which makes him more confident in his strength training.
2.1.2 Design Objectives

Based on the motivating scenario, the design of JARVIS aims to achieve the following goals:

Fully Interactive and Immersive Exercise Experience: JARVIS is designed to create a virtual reality machine exercise training scene with a virtual avatar to provide users an immersive and interactive machine exercise experience. One of the key roles of a virtual avatar is to interact with a user by providing guidance on the performed exercise in real time. To achieve this goal, JARVIS provides a scene in the VR HMD showing a virtual body of the user, following the user's movement, to intuitively guide her exercises. It also aims to provide the user's exercise progress and quality information in real time.

Comprehensive Machine Exercise Analytics: JARVIS is designed to provide comprehensive analytics of the machine exercises to users. Existing work on exercise sensing systems focuses on detecting exercise sessions, counting exercise repetitions, and recognizing exercise types [21, 9]. Although repetitions and types are useful for tracking exercise progress, providing feedback to users about the quality of their exercises is more important to novice and intermediate users. As such, JARVIS aims to provide comprehensive machine exercise analytics by tracking the exercise progress and quantifying the quality of the performed machine exercises.

Wearable for Machines: JARVIS is designed to be "wearable" for exercise machines. Existing work uses smartphones and wearable devices such as smartwatches and armbands (i.e., human-wearables) to track free weight and body weight exercises [21, 22]. However, in the domain of machine exercises, using sensing devices temporarily attachable to machines (i.e., machine-wearables) has two significant advantages over using human-wearable devices. First, machine-wearables can capture abdominal and lower limb machine exercises that human-wearables fail to capture. Second, machine-wearables only capture the machine's constrained movements, thereby providing much cleaner motion data than human-wearables. In contrast, human-wearables capture body movement noise as well as non-exercise body movements between exercise sessions, which requires significant signal processing to filter out. JARVIS aims to leverage these advantages to precisely and accurately monitor and detect exercise context information.

Universal Sensing Platform: JARVIS is designed to provide a universal sensing platform that can eventually be used for any exercise machine in a plug-and-play manner. Pioneer work in powering exercise machines with sensing capabilities explored customized instrumentation of exercise machines by integrating different types of sensing devices into different exercise machines [25]. This approach requires considerable effort from machine manufacturers in modifying exercise machines, leading to a significant increase in the cost of the exercise machines. In contrast, JARVIS aims to reduce the burden on machine manufacturers by developing a uniform sensing device that can provide sensing capability to any exercise machine without customized modification. We envision that in the future, every exercise machine will have a standardized slot/interface for the uniform sensing device to plug in. The exerciser will unplug the device after the exercise. As such, JARVIS acts as a personal fitness platform that tracks an individual's exercises on any machine.

To the best of our knowledge, no existing exercise sensing system meets all of the above design goals. This motivates us to design JARVIS to fill this critical gap.

Figure 2.1: The 12 target machine exercises.

2.1.3 Targeted Machine Exercises

In this work, we target twelve machine exercises recommended by the resistance training guide for healthy adults from the American College of Sports Medicine (ACSM) [8]. These exercises represent the most common machine exercises that target different muscle groups on the body. Each exercise uses a dedicated machine to train the specific muscle group. Figure 2.1 illustrates the 12 targeted machine exercises and the corresponding muscle groups.
Chapter 3 Related Work

Immersive VR has undergone a transition in the past few years that has taken it out of the realm of expensive toys and into that of functional technology [10]. As one of its most important applications, we envision that immersive VR will fundamentally change people's exercise experiences.

In the past decade, academia and industry have achieved significant success in developing wearable and mobile sensing systems for tracking aerobic exercises such as walking, jogging and running. Recently, the research focus has shifted to sensing anaerobic exercises (i.e., muscle strength training exercises). In general, muscle strength training exercises can be grouped into three categories: free weight exercises, body weight exercises, and machine exercises. Most of the existing work focused on sensing free weight and body weight exercises. For example, in [9], Chang et al. demonstrated the feasibility of using two accelerometers, one worn on the hand and the other on the waist, to track free weight exercises. In [21], Morris et al. developed RecoFit, which used a single inertial sensor worn on the right forearm of an individual to monitor both free weight and body weight exercises. In [23], Muehlbauer et al. leveraged the accelerometer and gyroscope inside a smartphone, worn on the arm, to track upper body exercises. The fundamental difference between these works and JARVIS is that they use smartphones or custom-made sensing devices worn on the human body to monitor free weight and body weight exercises. In comparison, our work uses miniature IoT-based sensing devices "worn" on the stationary gym machines to track machine exercises. Finally, in [11], Ding et al. developed FEMO, an RFID-based sensing system for free weight exercise monitoring. Our work is similar to FEMO in the sense that we both attach sensors onto exercise instruments (FEMO attaches RFIDs to dumbbells). However, since FEMO uses RF signals to track dumbbell movements, its performance suffers from the interference caused by the movements of other exercisers in the gym. In contrast, JARVIS uses an accelerometer and a gyroscope to capture machine movements caused by the exerciser's exercise-related body motions, which are not interfered with by the movements of other exercisers. More importantly, JARVIS leverages immersive VR to create controllable 3D stimulus environments as well as an engaged virtual avatar to guide exercisers in a highly interactive way that was not previously possible using existing approaches.

Chapter 4 JARVIS System Overview

Figure 4.1 illustrates the system architecture of JARVIS. As illustrated, JARVIS consists of two devices: a miniature IoT sensing device that is attachable to gym machines to track machine exercises, and a VR HMD that processes the sensor data and visualizes the computer-generated simulated environment as well as the exercise information to an exerciser via the HMD UI in real time.

Figure 4.1: The system architecture of JARVIS.

As the core of JARVIS, the Real-time Exercise Analytics Engine and the VR Synthesis Engine run inside the VR HMD at the backend and the frontend respectively. At the backend, the Real-time Exercise Analytics Engine retrieves the sensor data from the IoT sensing device and analyzes the sensor data. Specifically, the Real-time Exercise Analytics Engine consists of three major components: Exercise Progress Tracker, Exercise Type Recognizer, and Exercise Quality Assessor. The role of the Exercise Progress Tracker is to segment the streaming sensor data into individual exercise repetitions (Repetition Segmentor), count the repetition numbers (Repetition Counter), and track the progress of exercise within each repetition (Motion Progress Detector). Given the segmented repetitions, the Exercise Type Recognizer detects the exercise type of each repetition while the Exercise Quality Assessor provides a quantitative evaluation of the performed exercise repetition by comparing it with the guide models from professional trainers.
At the frontend, the VR Synthesis Engine synthesizes the immersive computer-generated simulated gym exercise environment with a virtual body of the exerciser. It also provides real-time exercise analytics based on the exercise information from the backend Real-time Exercise Analytics Engine. Specifically, the VR Synthesis Engine consists of three major components: VR Scene Manager, Exercise Information Visualizer, and Virtual Body Animator. The role of the VR Scene Manager is to initiate a virtual exercise scene corresponding to the current exercise type, recognized by the Exercise Type Recognizer in the backend, while highlighting the target muscle group based on the exercise and personal goal profiles. In addition, it provides a visual quality and computation load summary to UI developers. The Exercise Information Visualizer generates a head-up display in the virtual space and delivers exercise progress and quality information to the user in real time. The Virtual Body Animator makes a virtual body move following the user's movement, based on the real-time progress information provided by the Motion Progress Detector. In the following two sections, we describe the VR Synthesis Engine and the Real-time Exercise Analytics Engine in detail.

4.1 VR Synthesis Engine

4.1.1 Automatic Scene Management

Once a user begins exercising, the VR Scene Manager automatically initiates a virtual exercise scene, based on the exercise type recognized by the Exercise Type Recognizer. Figure 4.1 (on the right side) shows an example scene for a seated abs machine. For user convenience, while a user is not exercising, the manager shows the outside view (i.e., pass-through) using the camera placed at the back of the VR HMD.

4.1.1.1 Muscle Highlighting

In the scene, the VR Scene Manager visually highlights target muscle groups, corresponding to the exercise type (e.g., seated abs) and personal goal (e.g., muscle shaping, better spine structure), retrieved from exercise/personal goal preset databases. For example, in the UI example of Figure 4.1, the app paints the oblique abdominis muscles purple, to which users should give their attention during exercise. This feature hypothesizes that visually highlighting the target muscle will lead to a greater mind-muscle connection (MMC) [30]. MMC is a practical term denoting a strategy to give attentional focus to consciously direct neural drive to the muscle, usually achieved through imagination [30]. Increased MMC leads to greater muscle activation, which potentially increases muscle protein accretion [32, 33].

4.1.1.2 Smart Visual Quality Guide

One important feature of the VR Scene Manager is to provide UI developers useful suggestions to achieve higher perceived VR rendering quality while reducing the computation workload of the CPU and GPU.

Figure 4.2: Example of three different quality options.

Along with the profiling tool of Unity (a widely-used game and interactive app development framework), the system provides UI developers computation load profiles and suggestions to render fine-quality graphics (i.e., detailed models with more polygons and a finer shader) on the target muscle group, while drawing the remaining body parts using a moderate to low number of polygons and/or a less expensive shader. This development-side suggestion aims to enable higher perceived visual quality, while reducing the energy consumption and heat generated by the VR HMD. For example, Figure 4.2 shows a balanced option of quality and computation load (in the middle), mixing a standard shader and a high-quality shader (e.g., a bumped specular shader with lightmap). Later we evaluate how this approach contributes to reduced heat generation of the VR HMD, which is beneficial for exercisers' experience as well as for preventing overheat termination of the VR application.

4.1.2 Real-time Virtual Body Animation

The Virtual Body Animator generates a virtual body of the user that follows the user's movement during the exercise, based on the real-time progress information provided by the Motion Progress Detector. It enables the user to intuitively understand the pace and progress of the current repetition. After several rounds of pilot tests in the development phase, we put a mirror in front of the virtual body to better show users their virtual body and its movement, which turned out to be effective in our case study.
4.1.3 Informative Head-Up Display (HUD)

The Exercise Information Visualizer collects real-time exercise information from the backend architecture and displays the information on the VR screen. It employs a HUD interface in a fixed location, indicating rich exercise information including repetition count, pace, phase information, and quality. It also shows whether the pace of the exercise is too fast or too slow, in comparison with the recommendation of the American College of Sports Medicine (ACSM) [8]. The recommendation provides pace guidelines, for example, for fast (less than 2 seconds per repetition) and moderate (2-4 seconds) paces of strength training. Also, it provides a pace breakdown for the two phases in a repetition, the eccentric and concentric phases, for exercisers to maximize their performance [17].

4.2 Real-time Exercise Analytics Engine

4.2.1 Data Acquisition and Preprocessing

We use the CC2650STK SensorTag device developed by Texas Instruments (TI) [7] to equip exercise machines with sensing capabilities. The SensorTag is TI's state-of-the-art IoT device that integrates high-performance sensors and low-power wireless communication in a miniature form factor. In this work, we use the on-board 3-axis accelerometer and 3-axis gyroscope with a sampling rate of 10 Hz to capture machine exercises. After that, we apply a moving average filter with a length of 10 to suppress high-frequency noise.

Figure 4.3: The miniature SensorTag device and a customized 3D printed case with an embedded magnet.

To help exercisers easily attach the sensor tag to exercise machines, we have designed and 3D printed a plastic case (1.79 x 2.64 x 0.55 inches) with an embedded magnet to host the sensor tag. With the magnet, the tag can be easily yet firmly attached to exercise machines (see Figure 4.3).

4.2.2 Exercise Progress Tracking

4.2.2.1 Repetition Segmentation and Counting

The goal of repetition segmentation is to segment the streaming sensor data so that each segment contains one complete repetition of the performed machine exercise. Since a user can place the sensor tag on exercise machines in different ways, which leads to different orientations, one straightforward scheme is to derive the orientation-independent acceleration magnitude signal and then apply peak detection on top of it to segment exercise repetitions. However, such a scheme is unsuitable in the context of machine exercises because different machine exercises have different numbers of peaks and valleys in each repetition. As an example, Figure 4.4(a) and Figure 4.4(c) illustrate the three axes of the accelerometer signal as well as the corresponding acceleration magnitude signal of three repetitions of Pulldown and Seated Abs respectively. As shown, within each repetition segment, the acceleration magnitude signal of Seated Abs has one peak and two valleys while Pulldown has three peaks and two valleys. Without knowing the machine exercise type a priori, simply applying peak detection can split one repetition into multiple segments.

In this work, we design a Principal Component Analysis (PCA) based scheme to segment repetitions. The key observation behind our scheme is also illustrated in Figure 4.4. Specifically, Figure 4.4(b) and Figure 4.4(d) illustrate the principal component (PC) extracted from the three axes of the accelerometer signal of Pulldown and Seated Abs respectively. We observe that even though the exercise types are different, each repetition intersects the mean crossing line of the PC signal exactly twice. The same observation holds true for all of the 12 targeted machine exercises. This is because one repetition of any type of the targeted machine exercises consists of one concentric phase (i.e., muscle shortening) and one eccentric phase (i.e., muscle lengthening), and the PC reliably captures both phases.

Based on this key observation, our PCA-based repetition segmentation scheme first extracts the PC of the 3-axis accelerometer data and finds the mean crossing points of the PC. Second, our scheme finds out whether the PC is going downward or upward at the mean crossing point. If going downward, the lowest minimum between every non-overlapping pair of mean crossing points is the peak of the repetition, and the two closest maxima on the left and right sides of the two mean crossing points are the start and end points of the repetition. If going upward, the highest maximum between every non-overlapping pair of mean crossing points is the peak of the repetition, and the two closest minima on its left and right are the start and end of the repetition. After a new repetition is segmented, the number of segmented repetitions is counted and updated in real time.

Figure 4.4: The illustration of the principle of the repetition segmentation algorithm. (a) and (c): 3-axis accelerometer data and the corresponding acceleration magnitude signal of three repetitions of Pulldown (a) and Seated Abs (c). (b) and (d): the principal component (PC) extracted from the three axes of the accelerometer signal of Pulldown (b) and Seated Abs (d).
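To make the segmentation logic above concrete, here is a minimal sketch of how the PCA projection and mean-crossing rule could be implemented. It assumes the 10-sample moving average from Section 4.2.1; the fixed search window used to locate the start/end extrema around each crossing pair is an assumption introduced for illustration, whereas the thesis simply takes the nearest maxima or minima.

```python
import numpy as np

def smooth(x, w=10):
    """Moving-average filter of length w applied to each axis (Section 4.2.1)."""
    kernel = np.ones(w) / w
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, x)

def first_principal_component(acc):
    """Project 3-axis accelerometer data (N x 3) onto its first principal component."""
    centered = acc - acc.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)  # SVD-based PCA
    return centered @ vt[0]

def segment_repetitions(acc, w=10, search=30):
    """Return (start, peak, end) sample indices, one triple per detected repetition."""
    pc = first_principal_component(smooth(acc, w))
    above = pc > pc.mean()
    crossings = np.where(above[:-1] != above[1:])[0]          # mean-crossing indices
    reps = []
    for c0, c1 in zip(crossings[0::2], crossings[1::2]):      # non-overlapping crossing pairs
        going_down = pc[c0 + 1] < pc[c0]
        seg = pc[c0:c1 + 1]
        # Peak of the repetition: extremum between the two crossings.
        peak = c0 + (np.argmin(seg) if going_down else np.argmax(seg))
        # Start/end: the closest opposite extrema around the crossing pair,
        # searched within a fixed window here (an assumption for this sketch).
        pick = np.argmax if going_down else np.argmin
        lo = max(0, c0 - search)
        start = lo + pick(pc[lo:c0 + 1])
        end = c1 + pick(pc[c1:c1 + search])
        reps.append((int(start), int(peak), int(end)))
    return reps
```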
4.2.2.2 Motion Progress Detection

To enable engaged interaction between the exerciser and the virtual avatar, JARVIS needs to track the progress of the exerciser within each repetition in real time. However, providing the exact progress status within each repetition in real time is not possible, because the exact progress status can only be available after seeing the complete repetition. As an alternative, we design a motion progress estimation algorithm to provide a reasonable estimate of the progress status within each repetition in real time. Specifically, our algorithm uses the values of the PC signal to estimate the progress status. The progress status starts from 0% and ends at 100% with a step of 10%. The specific values of the PC signal that correspond to those status percentages are obtained by using previously seen repetitions as training data. During real-time progress tracking, if the value of the PC falls between two status percentages, the higher percentage is reported as the estimated progress status.

4.2.3 Exercise Type Recognition

After segmenting the machine exercises into repetitions, the second stage of the machine exercise analytics engine is to identify the type of the machine exercise within each repetition. Due to the different mechanical constraints of exercise machines, each type of machine exercise has a certain form. Therefore, we leverage this observation and frame the machine exercise identification problem as a classification problem. As explained before, exercisers can place the sensor tag on exercise machines in different ways, which leads to different orientations. To make our algorithm orientation-independent, we compute the magnitude of the three-dimensional accelerometer data as well as the magnitude of the three-dimensional gyroscope data within each repetition. Based on these two magnitudes, we extract a total of 28 features which have been proven to be effective for activity recognition [34]. Table 4.1 summarizes the list of features. Finally, we stack the extracted features into a feature vector and feed the feature vector into a linear kernel Support Vector Machine for classification.

4.2.3.1 Sensor Location

To increase recognition accuracy, we leverage the fact that the sensor tag can be attached to any steel part of exercise machines. We collect motion data from two locations on each machine, and find the best combination of locations across the machines by trying all possible location combinations. In this work, we employ 12 machines, and thus the number of all possible sensor location combinations is 4096 (2^12). Our criteria in choosing the two locations are as follows: (1) locations should be easily accessible by users, and (2) the two locations on one machine should show different ranges of motion. We tagged locations with a larger range of motion as 'L' and ones with a smaller range as 'S', and later compare accuracies using different location combinations.

4.2.3.2 Feature Selection

To find the best feature set providing the highest accuracy, we utilize the Sequential Floating Forward Selection (SFFS) feature selection algorithm [13] to identify a minimal subset of features achieving the best accuracy.

4.2.3.3 Session-wise Voting

To maximize the recognition accuracy, we utilize a voting scheme across repetitions in the same session. That is, we take the majority of recognition results from all repetitions in one session. Moreover, considering the requirement of online, real-time recognition, we later evaluate how many repetitions are required to achieve the highest accuracy.

Table 4.1: List of features for machine exercise recognition: Mean, Median, Standard Deviation, Variance, Skewness, Kurtosis, Energy, Interquartile Range, Spectral Entropy, First Order Derivative, Second Order Derivative, Magnitude of Average Rotational Speed, Dominant Frequency, RMS, Signal Magnitude Area.
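The recognition pipeline described above could be assembled roughly as follows: orientation-independent magnitude signals, a per-repetition feature vector in the spirit of Table 4.1, a linear-kernel SVM, and session-wise majority voting. The individual feature formulas below are reasonable stand-ins rather than the thesis's exact definitions, and scikit-learn/SciPy are used here in place of the Android libSVM port mentioned in Chapter 5.

```python
import numpy as np
from collections import Counter
from scipy import stats
from sklearn.svm import SVC

def magnitude_features(sig):
    """A subset of the Table 4.1 statistics for one magnitude signal (1-D array)."""
    iqr = np.percentile(sig, 75) - np.percentile(sig, 25)
    psd = np.abs(np.fft.rfft(sig)) ** 2
    p = psd / psd.sum()
    spectral_entropy = -np.sum(p * np.log2(p + 1e-12))
    return np.array([
        sig.mean(), np.median(sig), sig.std(), sig.var(),
        stats.skew(sig), stats.kurtosis(sig),
        np.sum(sig ** 2),            # energy
        iqr, spectral_entropy,
        np.mean(np.diff(sig)),       # first-order derivative (mean)
        np.mean(np.diff(sig, n=2)),  # second-order derivative (mean)
        np.argmax(psd[1:]) + 1,      # dominant frequency bin
        np.sqrt(np.mean(sig ** 2)),  # RMS
        np.sum(np.abs(sig)),         # signal magnitude area
    ])

def repetition_features(acc, gyro):
    """Orientation-independent feature vector for one segmented repetition."""
    acc_mag = np.linalg.norm(acc, axis=1)
    gyro_mag = np.linalg.norm(gyro, axis=1)
    return np.concatenate([magnitude_features(acc_mag), magnitude_features(gyro_mag)])

# Linear-kernel SVM trained on per-repetition feature vectors (X: n_reps x n_feats).
clf = SVC(kernel="linear")
# clf.fit(X_train, y_train)

def predict_session(clf, session_reps):
    """Session-wise voting: majority label over all repetitions in one session."""
    votes = clf.predict(np.vstack(session_reps))
    return Counter(votes).most_common(1)[0][0]
```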
4.2.4 Exercise Quality Assessment

The final stage of the analytics engine is to assess the quality of the machine exercises performed by exercisers. Assessing the quality of a performed exercise is subjective. A professional trainer usually assesses the quality of a user's exercise based on her body posture, the speed of doing the repetition, etc. In our case, we have only motion sensor data to assess the quality of the exercise, so we aim to assess the quality of the performed exercises based on motion sensor data. In particular, we compare the user's exercise motion data with the trainer's exercise motion in order to find out the similarity with the trainer's data. Based on the similarity score, we give feedback to the user about how well they are doing during each repetition.

To achieve this goal, we recruited one male professional trainer and one female professional trainer from the fitness center at the university. We collected data for our targeted 12 machine exercises from both trainers and use their data as the guide models for male and female users respectively. Therefore, we assess the quality of participants' exercises by comparing the similarity between participants' exercise data and the guide models. Although the quality of the guide models can be further improved by incorporating data from more trainers, we include only one male and one female trainer to examine the feasibility of our approach.

In this work, we design a motion trajectory based scheme to measure the similarity between each exercise repetition and the guide models. Specifically, the first step of our scheme is to divide the sensor data of each segmented repetition into a sequence of tiny windows whose length is much smaller than the duration of the complete repetition itself (the duration of a complete repetition ranges from 3 to 5 seconds across the different targeted machine exercises; the length of the tiny window we use is 0.5 second). Then we extract a number of features which capture the intrinsic characteristics of each repetition from each tiny window and stack them together to form a local feature vector. As a consequence, each repetition segment is transformed into a sequence of local feature vectors, which forms a motion trajectory in the feature space. Based on the extracted motion trajectory, we have developed a trajectory comparison algorithm to quantify the similarity between two motion trajectories. It should be emphasized that the key benefit of the motion trajectory transformation to our quality assessment task is that the motion trajectory provides fine-grained descriptions about where the user's exercise repetition differs from the trainer's guide model. As such, our platform is capable of providing very concrete feedback to exercisers on how to improve their exercise quality. Below we describe the features we extract from the tiny windows and the details of the trajectory comparison algorithm.

4.2.4.1 Local Feature Extraction

We extract five local features from each tiny window. These features are selected because they capture different aspects of the exercise quality.

Average of Movement Intensity (AI): AI is computed as the average of Motion Intensity (MI), where Motion Intensity (MI) is defined as the Euclidean norm of the acceleration vector. AI measures the average strength level of the exercise repetition.

Variation of Movement Intensity (VI): VI is computed as the variation of MI. It measures the strength variation of the exercise repetition.

Smoothness of Movement Intensity (SI): SI is computed from the derivative values of MI. It measures the smoothness of the exercise repetition.

Averaged Acceleration Energy (AAE): AAE calculates the mean value of energy over the three accelerometer axes. It measures the total exercise acceleration energy.

Averaged Rotation Energy (ARE): ARE calculates the mean value of energy over the three gyroscope axes. It measures the total exercise rotation energy.
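A minimal sketch of the window-level feature extraction is given below, assuming the 10 Hz sampling rate and 0.5 s windows stated above; the exact formulas for variation, smoothness and energy are paraphrased from the descriptions rather than taken from the thesis implementation.

```python
import numpy as np

def local_features(acc_win, gyro_win):
    """Five local features for one 0.5 s window (rows are samples, columns x/y/z)."""
    mi = np.linalg.norm(acc_win, axis=1)          # motion intensity per sample
    ai = mi.mean()                                # AI: average movement intensity
    vi = mi.var()                                 # VI: variation of movement intensity
    si = np.mean(np.abs(np.diff(mi)))             # SI: smoothness (derivative of MI)
    aae = np.mean(np.sum(acc_win ** 2, axis=0))   # AAE: mean energy over accel axes
    are = np.mean(np.sum(gyro_win ** 2, axis=0))  # ARE: mean energy over gyro axes
    return np.array([ai, vi, si, aae, are])

def trajectory(acc, gyro, fs=10, win_s=0.5):
    """Turn one repetition into its motion trajectory: a sequence of local feature vectors."""
    w = int(fs * win_s)
    steps = range(0, len(acc) - w + 1, w)
    return np.array([local_features(acc[i:i + w], gyro[i:i + w]) for i in steps])
```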
4.2.4.2 Motion Trajectory Comparison

The goal of the trajectory comparison is to quantify the similarity between the motion trajectory extracted from an exerciser's repetition and the motion trajectory extracted from the trainer's repetition of the same machine exercise. As such, the exercise quality of the exerciser can be derived. However, one challenge for trajectory comparison is that trajectories from two repetition segments may have different lengths. In this work, we develop the trajectory comparison algorithm based on the Multidimensional Dynamic Time Warping (DTW) technique to resolve this issue. DTW is a nonlinear alignment technique for measuring the similarity between two signals which have different lengths. In our case, DTW is used to cope with different trajectory lengths. Specifically, let X denote the motion trajectory of the trainer's repetition and Y denote the motion trajectory of the user's repetition whose exercise quality is being assessed:

X = x_1, x_2, ..., x_i, ..., x_M    (4.1)
Y = y_1, y_2, ..., y_j, ..., y_N    (4.2)

where x_i and y_j represent the i-th and j-th local feature vectors in X and Y respectively, and M and N represent the lengths of X and Y respectively. DTW compensates for the length difference and finds the optimal alignment between X and Y by solving the following dynamic programming (DP) problem:

D(i, j) = min{D(i-1, j-1), D(i-1, j), D(i, j-1)} + d(i, j)    (4.3)

where d(i, j) represents the distance function which measures the local difference between local feature vectors x_i and y_j in the feature space, and D(i, j) represents the cumulative (global) distance between the sub-trajectories {x_1, x_2, ..., x_i} and {y_1, y_2, ..., y_j}. The solution of this DP problem is the cumulative distance between the two trajectories X and Y, which sits in D(M, N), and a warp path W of length K

W = w_1, w_2, ..., w_k, ..., w_K    (4.4)

which traces the mapping between X and Y. Since the cumulative distance D(M, N) is dependent on the length of the warp path W, we normalize D(M, N) by dividing it by the warp path length K and use this averaged cumulative distance as the metric to measure the distance between trajectories X and Y:

Dist(X, Y) = D(M, N) / K    (4.5)

In this work, we use the cosine distance as the local distance function:

d(i, j) = 1 - (x_i^T y_j) / (||x_i|| ||y_j||)    (4.6)

Compared to other distance functions, the benefit of using the cosine distance is that d(i, j) is by nature in the range [0, 1]. As a result, the averaged cumulative distance Dist(X, Y) in (4.5) is also in the range [0, 1], and thus can be interpreted as the dissimilarity between X and Y in terms of percentile. Therefore, we can define the similarity score in percentile between X and Y as:

Sim(X, Y) = 1 - Dist(X, Y)    (4.7)

Figure 4.5: (a) and (b): the motion trajectories of one repetition of Leg Extension in the feature spaces of AI (a) and VI (b) with similarity score 98.21%. (c) and (d): the motion trajectories of one repetition of Biceps Curl in the feature spaces of AI (c) and VI (d) with similarity score 16.97%.

Figure 4.5 provides an illustration of our motion trajectory based scheme for exercise quality assessment. In particular, Figure 4.5(a) and Figure 4.5(b) show the motion trajectories of one repetition of Leg Extension in the feature spaces of AI and VI respectively. The red trajectories represent the repetition performed by the trainer and the black trajectories represent one high-quality repetition performed by the exerciser. The similarity score between these two repetitions is 98.21%. In comparison, Figure 4.5(c) and Figure 4.5(d) compare a low-quality repetition of Biceps Curl with the trainer guide model. The similarity score in this case is 16.97%. As demonstrated, the higher the similarity score, the better the exercise quality. We obtain similar results for the other machine exercises.
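The sketch below implements Equations (4.3) through (4.7) directly: a dynamic-programming table for the cumulative distance, warp-path length bookkeeping for the normalization in (4.5), and the cosine local distance of (4.6). It is an illustrative implementation operating on the trajectories produced by the window-level sketch above, not the code used in the thesis.

```python
import numpy as np

def cosine_local_distance(xi, yj):
    """Eq. (4.6): 1 minus cosine similarity; stays in [0, 1] for non-negative features."""
    denom = np.linalg.norm(xi) * np.linalg.norm(yj) + 1e-12
    return 1.0 - float(xi @ yj) / denom

def dtw_similarity(X, Y):
    """Similarity score of Eq. (4.7) between two motion trajectories (M x 5 and N x 5)."""
    M, N = len(X), len(Y)
    D = np.full((M + 1, N + 1), np.inf)
    D[0, 0] = 0.0
    steps = np.zeros((M + 1, N + 1), dtype=int)     # warp-path length bookkeeping
    for i in range(1, M + 1):
        for j in range(1, N + 1):
            d = cosine_local_distance(X[i - 1], Y[j - 1])
            choices = (D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
            k = int(np.argmin(choices))
            D[i, j] = choices[k] + d                # Eq. (4.3)
            prev = [(i - 1, j - 1), (i - 1, j), (i, j - 1)][k]
            steps[i, j] = steps[prev] + 1
    K = steps[M, N]                                 # warp path length
    dist = D[M, N] / K                              # Eq. (4.5)
    return 1.0 - dist                               # Eq. (4.7)

# Example usage with the trajectory() helper from the previous sketch:
# sim = dtw_similarity(trajectory(acc_trainer, gyro_trainer),
#                      trajectory(acc_user, gyro_user))
```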
Chapter 5 System Evaluation

5.1 Experimental Setup

We recruited 15 participants (11 males and 4 females) who volunteered to help collect data and conduct evaluation experiments. The participants are university students, researchers and staff, with ages ranging from 22 to 48 years old (mean = 27.73, SD = 6.65), weights ranging from 42 kg to 85 kg (mean = 60.51, SD = 8.85), heights ranging from 152 cm to 189 cm (mean = 174, SD = 6.50), and experience levels covering the novice and intermediate levels. The experiments were conducted at a fitness center on the university campus.

As mentioned earlier, to examine the impact of sensor placement locations on the performance of our system, we attached two sensor tag devices to two different locations on each machine. Figure 5.1 presents an example of the two tag locations on two machines. We envision that in the future, every exercise machine produced by machine manufacturers will have a standardized slot to plug in the tag. Our experiment will help machine manufacturers find the best location for the standardized slots.

During data collection, the participants were instructed to perform the twelve targeted machine exercises by following the short set of instructions on each machine. To capture the intra-subject variability, each participant attended three sessions of data collection. In each session, the participant performed 10 repetitions of each machine exercise. In total, each participant contributed 30 repetitions for each exercise.

Figure 5.1: The illustration of the data collection setup and sensor tag placement using the lateral raise machine (left) and seated abs machine (right). Circled 1 and 2 indicate the placement of the two sensor tags on each machine.

5.2 Performance of Repetition Segmentation and Counting

5.2.1 Evaluation Metrics

We evaluate the performance of our repetition segmentation scheme using three metrics:

Miss Rate (MSR): MSR is defined as the proportion of cases in which our scheme misses a repetition.

Merge Rate (MGR): MGR is defined as the proportion of cases in which our scheme merges two or more repetitions into one repetition.

Fragmentation Rate (FR): FR is defined as the proportion of cases in which our scheme splits a single repetition into more than one repetition.

5.2.2 Evaluation Results of Repetition Segmentation

We evaluate the performance of our repetition segmentation scheme on our full dataset of 5400 repetitions of 12 exercises from 15 participants. Table 5.1 shows the performance of our repetition segmentation scheme for each type of machine exercise. In terms of MSR, our scheme achieves zero MSR for all machine exercises. In terms of MGR, our scheme achieves zero MGR for all machine exercises except E08 Leg Curl, with an MGR of only 0.11%. Finally, in terms of FR, our scheme achieves zero FR for 6 out of 12 types of machine exercises. Among the other 6 types, the highest FR is only 0.33%, for E05 Triceps Press and E10 Row Deltoid. Taken together, the repetition segmentation results indicate that our scheme is able to achieve highly accurate and robust segmentation performance across all types of machine exercises.

Table 5.1: Performance of repetition segmentation.
Exercise  E01   E02   E03   E04   E05   E06   E07   E08   E09   E10   E11   E12
MSR (%)   0.00  0.00  0.00  0.00  0.00  0.00  0.00  0.00  0.00  0.00  0.00  0.00
MGR (%)   0.00  0.00  0.00  0.00  0.00  0.00  0.00  0.11  0.00  0.00  0.00  0.00
FR (%)    0.11  0.11  0.11  0.00  0.33  0.00  0.00  0.22  0.00  0.33  0.00  0.00

5.2.3 Evaluation Results of Repetition Counting

Based on the repetitions segmented by our scheme, we achieve a repetition counting accuracy of 97.96% out of a total of 5400 repetitions. Taking a closer look at the counting results, we achieve an accuracy of 99.81% for the within-1 scenario (i.e., at most 1 count off compared to the ground truth within one session of 10 repetitions) and 100% for the within-2 scenario.

5.3 Performance of Exercise Recognition

5.3.1 Impact of Sensor Location and Voting

First, we examine the impact of the sensor placement location, as well as the session-wise voting mechanism, on exercise recognition performance. The goal is to find the best sensor tag deployment location on exercise machines. To achieve this goal, we trained four sets of classifiers: 1) from both locations; 2) only from the location with the larger movement range, 'L'; 3) only from the location with the smaller range, 'S'; and 4) from the best location combinations. Specifically, the first classifier is trained with data from both locations merged at the feature level. Again, the fourth is determined by brute-force searching among the 4096 (2^12) sensor placement combinations. Also, we trained another four sets with the aforementioned voting among repetitions in the same session. The evaluation results are listed in Table 5.2, using a leave-one-subject-out cross-validation strategy.
The maximum accuracy of 99.08% is achieved from the best location combination with session-wise voting. The best location combination helps to distinguish between exercises with similar movement patterns (for example, Seated Abs, Leg Extension and Lateral Raise have very similar movement patterns) by using different ranges of motion from different locations on the machines. Our further investigation of the latency of the voting mechanism finds that the best accuracy can be achieved quickly, after 1.22 repetitions on average (SD = 0.92). By comparing the results of the system in a lateral fashion, we see that the system performs better when it uses session-wise voting, which motivates us to use this technique to increase the performance. Second, by comparing the results in a longitudinal fashion, we find that the best location combination outperforms the other three by around 10 percentage points. This implies that our study of the impact of sensor location can guide the manufacturer and user to find and mark the best locations.

Table 5.2: Overall precision, recall, accuracy and F-measure for different sensor locations and voting options.
Dataset: Without session-wise voting (Precision / Recall / Accuracy / F-measure), With session-wise voting (Precision / Recall / Accuracy / F-measure)
All locations: 0.8325 / 0.8266 / 82.67% / 0.8295, 0.8970 / 0.8887 / 88.88% / 0.8930
Location 'L' only: 0.8102 / 0.8081 / 80.82% / 0.8091, 0.8854 / 0.8809 / 88.10% / 0.8831
Location 'S' only: 0.8114 / 0.8121 / 81.22% / 0.8117, 0.8829 / 0.8815 / 88.17% / 0.8822
Best locations: 0.9432 / 0.9430 / 94.30% / 0.9431, 0.9911 / 0.9907 / 99.08% / 0.9909

5.3.2 Performance of Subject Independent Model

Next, we examine the exercise type recognition performance of the subject independent model. Figure 5.2 shows the average recognition accuracy across all the machine exercises using leave-one-subject-out cross validation. As shown, 11 out of 15 subjects achieve 100% accuracy while the other 4 subjects achieve accuracies of 97.24%, 97.22%, 97.22%, and 94.45%, respectively. In addition, the confusion matrix in Table 5.3 provides a detailed look at the results in terms of types of machine exercises. As shown, 10 out of 12 machine exercises achieve 100% precision and recall. Taken together, our results indicate that JARVIS can achieve high recognition performance across all the targeted machine exercises in a subject independent manner.

Figure 5.2: Performance of subject independent model.

Table 5.3: Confusion matrix of exercise type recognition.

5.3.3 Impact of Number of Features

Finally, we examine the impact of the number of features on the performance of exercise type recognition. Figure 5.3 shows the average recognition accuracy related to the number of features selected by the Sequential Floating Forward Selection (SFFS) algorithm using leave-one-subject-out cross validation. We use a linear SVM as our classifier for feature selection. As shown, the recognition accuracy increases in general as the number of features increases. The accuracy reaches its highest when using 23 features out of the total of 28 features. The most significant 23 features are: Mean, Median, Standard Deviation, Variance, Skewness, Kurtosis, Energy, Interquartile Range, Spectral Entropy, First Order Derivative, Second Order Derivative, Magnitude of Average Rotational Speed, and RMS (computed on the accelerometer and/or gyroscope magnitude signals). We also observe that the recognition accuracy already reaches 98% when using only 12 features. This result indicates that by using a small number of features, JARVIS could still achieve high recognition performance.

Figure 5.3: Accuracy changes over number of features.
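For illustration, a greedy forward-selection loop scored by leave-one-subject-out accuracy, in the spirit of Figure 5.3, might look like the following. Note that this is a simplification: the thesis uses SFFS [13], whose conditional backward (floating) step is omitted here, and the function names are hypothetical.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

def forward_select(X, y, groups, max_feats=28):
    """Greedy forward feature selection scored by leave-one-subject-out accuracy.
    groups: one subject id per repetition, so each fold holds out one subject."""
    logo = LeaveOneGroupOut()
    selected, curve = [], []
    remaining = list(range(X.shape[1]))
    for _ in range(min(max_feats, X.shape[1])):
        scores = {f: cross_val_score(SVC(kernel="linear"), X[:, selected + [f]], y,
                                     groups=groups, cv=logo).mean()
                  for f in remaining}
        best = max(scores, key=scores.get)
        selected.append(best)
        remaining.remove(best)
        curve.append((len(selected), scores[best]))   # accuracy vs. number of features
    return selected, curve
```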
5.4 Validity of Quality Assessment

To examine the validity of our exercise quality assessment scheme, we use the two trainers' data. Due to the strength difference between the male and female trainers, we compare the male/female trainer's exercise data against his/her own data. Specifically, we asked each trainer to perform the 12 targeted machine exercises. Each machine exercise was performed for 10 repetitions per session for three sessions. Given these data, we calculate the similarity scores for all pairs of repetitions within each machine exercise for the male trainer and the female trainer separately. We expect the similarity scores to be high because both of them are professional trainers who perform consistent, high-quality machine exercises. Figure 5.4 illustrates the average similarity scores related to the types of machine exercises for both the male trainer and the female trainer. As shown, the average similarity scores are high across all the machine exercises for both trainers. In particular, the lowest average similarity score is 90.80% and 88.25% for the male trainer and the female trainer respectively. This result indicates that our quality assessment scheme is able to reliably assess the exercise quality across different types of machine exercises for both males and females.

Figure 5.4: Validity of exercise quality assessment.

5.5 System Performance

In this section, we report the system performance of JARVIS, including processing time, energy consumption, and temperature changes. For the tests, we employ a Samsung Galaxy S6 smartphone, having a quad-core 2.1 GHz Cortex-A57 processor and 3 GB RAM and running Android OS v5.0.2 (Lollipop), mounted in a Samsung Gear VR headset. Also, we use the CC2650STK SensorTag as the sensing device.

5.5.1 Processing Time

To examine the computational demand of our exercise analytics engine, we measure the average processing time consumed by each component of the engine. The computational pipeline consists of the 5 core steps shown in Table 5.4. For the classification component, we employ the Android implementation of libSVM [2]. We run 5 sets of processing trials using 100 repetitions of different exercises' data from different participants. We calculated the average processing time using the results from these 5 sets of trials. Table 5.4 shows that our analytics engine can run fast with minimal computational overhead. Among all analytics engine components, exercise type recognition takes the most significant amount of time.

Table 5.4: Processing time statistics.
Task: Time required per repetition (ms)
Pre-processing: 0.012
Segmenting repetition: 0.018
Feature extraction: 0.006
Classification (using libSVM): 17.5
Quality assessment: 0.457

5.5.2 Energy Consumption

5.5.2.1 VR HMD Energy Consumption

Due to the immobility of our setup, as shown in Figure 5.5, it is impractical to measure the real-time energy consumption of JARVIS in a gym setting [19]. Instead, in a lab setting, we let the system still receive data packets from the sensor tag, and fed the exercise analytics engine previously-collected exercise data. We pre-loaded all records in memory to eliminate additional storage access during measurement. We use a Monsoon Power Monitor device [4] to measure the energy consumption of computation and communication tasks. To measure exactly the energy consumed by our system, we turned off irrelevant services and components including GPS, WiFi and cellular services. We also shut down all other applications. We profile the energy consumption per computational component. We measured the energy consumption 5 times and take the average value of the trials. Table 5.5 shows the breakdown of energy consumption for each component. Our calculation indicates that JARVIS can run for 2.85 hours on the Galaxy S6 smartphone, which has a 2550 mAh Li-Ion battery. For the VR Synthesis Frontend, we tested three versions providing different graphics rendering quality levels, which will be further described in the device temperature measurement. The three settings showed up to an 8.45% difference in terms of energy consumption.

Figure 5.5: Energy consumption measurement setting with Samsung Galaxy S7, Gear VR and Monsoon Power Monitor.

Table 5.5: Energy consumption statistics.
Device / Component: Current (mA), Power (mW)
Phone / BLE communication: 61.46, 245.7
Phone / Processing backend: 17.65, 70.57
Phone / VR frontend, Low Quality: 800.3, 3198
Phone / VR frontend, Mixed Quality: 815.6, 3259
Phone / VR frontend, High Quality: 867.9, 3468
SensorTag / 10 Hz data transmission: 3.967, 11.90

5.5.2.2 SensorTag Energy Consumption

We measured and calculated the energy consumption and estimated battery lifetime of the TI CC2650STK SensorTag, powered by a 3 V, 240 mAh lithium coin cell battery. We powered the sensor tag with the Monsoon power monitor and let it transmit motion sensor data to the smartphone at 10 Hz. Table 5.5 shows the details. From our calculation, we find that the battery can last 60.5 hours.
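As a rough cross-check, the reported battery lifetimes follow from the Table 5.5 figures by simple capacity-over-current division. The assumption that the 2.85-hour phone estimate corresponds to the Mixed-Quality frontend setting is ours; the snippet is only a back-of-the-envelope illustration.

```python
# Rough battery-life estimates from the Table 5.5 measurements (capacity / current draw).
PHONE_BATTERY_MAH = 2550                     # Galaxy S6 Li-Ion battery
phone_current_ma = 61.46 + 17.65 + 815.6     # BLE + analytics backend + Mixed-Quality VR frontend
print(PHONE_BATTERY_MAH / phone_current_ma)  # ~2.85 hours, matching Section 5.5.2.1

TAG_BATTERY_MAH = 240                        # 3 V lithium coin cell in the SensorTag
print(TAG_BATTERY_MAH / 3.967)               # ~60.5 hours, matching Section 5.5.2.2
```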
5.5.3 Device Temperature and Visual Quality

We measured the temperature changes of the VR HMD device over time, with three different implementation settings of rendering quality: (1) High-Quality, (2) Mixed-Quality, and (3) Low-Quality (see Figure 4.2 for example screenshots). The High-Quality setting has 155.8k triangles to render on average in the game scene, with a bumped specular shader with lightmap. Low-Quality has 13.8k triangles in the game scene, with a standard shader for the virtual body. Mixed-Quality has 30.1k triangles, with an increased number of triangles and a bumped specular shader with lightmap on the target muscle groups (e.g., abdominals). We employed the logging features of the Android Debug Bridge (adb) shell to track temperature changes, and collected CPU and ambient temperature values. Our indoor testbed had a room temperature of about 25 degrees Celsius. We ran the VR application for 30 minutes.

Figure 5.6: CPU temperature changes over time. High-Quality overheated the phone after 15 min.

Figure 5.7: Ambient temperature changes over time.

As shown in Figure 5.6, the CPU temperature measurement results indicated that the Mixed- and Low-Quality settings allowed the VR HMD hardware to run the application for at least 30 minutes. Temperature levels for the two settings were under 70 degrees after 30 minutes of operation. However, the High-Quality setting overheated the CPU up to 73 degrees in 15 minutes, and the application was terminated for device safety. Ambient temperatures (in Figure 5.7) in Mixed- and Low-Quality reached 35 degrees after 30 minutes, whereas the one in High-Quality did so in 15 minutes. We carefully suggest that it is important to help developers estimate the effect of visual quality on temperature changes, and believe that our experiment results can be one helpful case for better temperature estimation.

Chapter 6 Case Study with JARVIS

In this case study, we evaluate the efficacy of our exercise tracking platform app with a questionnaire survey and EMG signal analysis, as well as an interview on its overall user experience. We set the seated abs as the target exercise, due to its importance for health benefits. Usually abdominal training is considered to build the rectus muscles around the belly button; however, the abdominal muscles are very sensitively connected to each other, and to the lumbar spine as well. It is known that abdominal muscle training is beneficial for lumbar stability; however, without correct guidance it primarily activates the rectus abdominis, while the oblique abdominis muscles (the highlighted muscle group in Figure 4.1) are considered to be more important contributors to lumbar stability [20]. One effective intervention for balanced abdominal training is verbal instruction, which showed statistically significant differences in terms of abdominis muscle activation measured by surface electromyography (sEMG) [14]. The goal of our case study is to see whether our muscle highlighting feature can help people concentrate on a particular muscle and whether it can help people activate that particular muscle. In this case study we capture the muscle activity data using a surface electromyography (sEMG) sensing device.

6.1 Study Methods

6.1.1 Participants and Instrumentation

We recruited 10 students, different from the participants of the earlier data collection, through department email distribution lists as well as on-site recruitment at a fitness center on the university campus. 5 students were recruited through each way of recruitment, respectively. All participants had prior experience with the seated abs machine. EMG data were collected using the Trigno Wireless sEMG System [1], which consists of one base station and multiple wireless sEMG sensors. Each sEMG sensor has a signal bandwidth of 20-450 Hz, a transmission range of 40 meters and a sampling rate of 4000 samples/sec with 16-bit resolution. The base station is capable of streaming data to manufacturer-provided analysis software over a USB wired connection. The transmitted data was saved into computer storage for analysis.

6.1.2 Study Design and Analysis

Following the abdominal exercise study [14], 4 sEMG sensors were placed on the left anterior abdominal wall, at the upper/lower rectus abdominis (URA/LRA) and external/internal oblique abdominis (EOA/IOA) (see Figure 6.1).

Figure 6.1: Illustration of sEMG sensor placement at upper/lower rectus abdominis (URA/LRA), and external/internal oblique abdominis (EOA/IOA).

Participants were instructed in two conditions: one condition with our exercise tracking application emphasizing activation of the oblique abdominis, and one controlled condition without the application. Prior to the condition with the exercise tracking platform, participants were told about the basic user interfaces of the system, and asked to take a look at the VR space with their virtual body. In the controlled condition, the participants were asked to perform the exercise naturally, as they usually would at a gym. For each condition, the participants performed two sets of the seated abs exercise with ten repetitions each. sEMG activity was recorded from the beginning to the end of each set. The participants were allowed to rest for up to two minutes between sets. The sequence of the conditions was balanced across the participants. After each condition, the participants were asked to respond to 19 selected and adjusted questions from the intrinsic motivation inventory (IMI) [26, 29], which is a widely used and extensively validated set of questionnaires for evaluating user experiences. Among the major categories of IMI, we chose three categories related to the application and exercise context: interest/enjoyment, perceived competence, and value/usefulness, having 7, 5 and 7 questions respectively. The responses were collected on a 7-point Likert scale and then averaged in each category for statistical analyses. A semi-structured interview followed after the end of the study. We asked the participants about their overall exercise experiences, including their perception of the VR application and its interface and the efficacy of the application, as well as their design suggestions. All interview data were transcribed and analyzed by open coding [31] and axial coding [12] to discover common themes and patterns.
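A small sketch of the kind of sEMG post-processing described above (a windowed RMS envelope as plotted in Figure 6.2, then a per-set mean normalized against a reference recording): the 4000 samples/sec rate comes from Section 6.1.1, while the 0.25 s RMS window and the choice of normalization reference are assumptions made for illustration.

```python
import numpy as np

def rms_envelope(emg, fs=4000, win_s=0.25):
    """Windowed RMS of one raw sEMG channel (the representation plotted in Figure 6.2)."""
    w = int(fs * win_s)
    n = len(emg) // w
    frames = emg[:n * w].reshape(n, w)
    return np.sqrt(np.mean(frames ** 2, axis=1))

def normalized_activity(set_emg, reference_emg):
    """Mean RMS of one exercise set, normalized by a per-muscle, per-participant
    reference recording (the normalization reference is an assumption here)."""
    return rms_envelope(set_emg).mean() / rms_envelope(reference_emg).mean()
```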
6.2 Results

6.2.1 EMG activity

Figure 6.2 shows representative sEMG data from individual trials performed under the conditions with and without VR respectively. In this figure it is evident that the sEMG amplitudes of the abdominis muscles change in response to repetitions. These changes in muscle activity are in concert with the changes in group mean sEMG values shown in Figure 6.3.

Figure 6.2: Without VR vs. With VR in terms of the root-mean-square (RMS) values of surface EMG signals.

Figure 6.3 provides summary data from the two conditions, with group means and standard deviations of the normalized EMG activity of each of the 4 muscle groups. There was a significant effect of the VR application for the two oblique muscle groups (p < 0.01 and p < 0.05 for external and internal oblique abdominis respectively), which shows the effectiveness of the VR application in activating target muscle groups, compared to the previous study in the literature of orthopaedic and sports physical therapy [14].

Figure 6.3: Mean normalized electromyography values for four muscle groups. With-VR significantly different from Without-VR at the oblique abdominis (** p < .01; * p < .05).

6.2.2 User experience: Questionnaire survey

Figure 6.4 shows summary data from the responses to the IMI questionnaire survey. For all three categories, there was a significant effect of the VR application (p < 0.05). In-depth analysis revealed that the students recruited through the mailing list, who had less experience with strength training machines, showed greater differences in all three categories than the ones recruited on-site at the fitness center. This gives a hint of a relationship between users' experience level and engagement. That is, the VR application has potential in motivating novice or less-experienced exercisers to engage in strength training, implying an opportunity for broader impact on public health.

Figure 6.4: IMI survey results for three categories. With-VR significantly different from Without-VR (* p < .05 for all three categories).

6.2.3 User experience: Interview analysis

The participants shared their perceptions of the JARVIS application, including its benefits and usefulness, as well as their suggestions to provide "sweet" (P7) and more enjoyable exercise experiences. Overall, they enjoyed and valued the application and its features, which is consistent with the earlier IMI survey results.

6.2.3.1 Virtual Self and its Movement
6.2.2 User experience: Questionnaire survey

Figure 6.4 shows summary data from the responses to the IMI questionnaire survey. For all three categories, there was a significant effect of the VR application (p < 0.05). In-depth analysis revealed that the students recruited through the mailing list, who had less experience with strength training machines, showed greater differences in all three categories than the ones hired on-site at the center. This hints at a relationship between users' experience level and engagement. That is, the VR application has the potential to motivate novice or less-experienced exercisers to engage in strength training, implying an opportunity for broader impact on public health.

Figure 6.4: IMI survey results for three categories. With-VR differs significantly from Without-VR (*p < .05 for all three categories).
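To make the scoring procedure concrete, the sketch below averages the 7-point Likert responses within each IMI category and runs a paired comparison between the two conditions. The item-to-category mapping and the paired t-test are illustrative assumptions: the text only states that 19 adapted items were averaged per category, and it does not say which statistical test produced the reported p-values.

import numpy as np
from scipy import stats

# Hypothetical mapping of the 19 adapted IMI items to the three categories
# (7 + 5 + 7 items); the actual item assignment is not given in the text.
CATEGORIES = {
    "interest/enjoyment": list(range(0, 7)),
    "perceived competence": list(range(7, 12)),
    "value/usefulness": list(range(12, 19)),
}

def category_scores(responses):
    """Average 7-point Likert responses (participants x 19 items) into one
    score per participant for each IMI category."""
    responses = np.asarray(responses, dtype=float)
    return {name: responses[:, idx].mean(axis=1) for name, idx in CATEGORIES.items()}

def compare_conditions(with_vr, without_vr):
    """Per-category paired comparison between the With-VR and Without-VR
    conditions; a paired t-test is only one plausible choice of test."""
    a, b = category_scores(with_vr), category_scores(without_vr)
    return {name: stats.ttest_rel(a[name], b[name]) for name in CATEGORIES}

# Toy usage with random responses from 10 participants.
rng = np.random.default_rng(0)
with_vr = rng.integers(4, 8, size=(10, 19))      # skewed toward higher ratings
without_vr = rng.integers(1, 8, size=(10, 19))
for name, result in compare_conditions(with_vr, without_vr).items():
    print(name, round(result.pvalue, 3))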
6.2.3 User experience: Interview analysis

The participants shared their perceptions of the JARVIS application, including its benefits and usefulness, as well as their suggestions for providing "sweet" (P7) and more enjoyable exercise experiences. Overall, they enjoyed and valued the application and its features, which is consistent with the earlier IMI survey results.

6.2.3.1 Virtual Self and its Movement

The participants liked that JARVIS enabled them to "see myself moving" (P8). Seeing their virtually represented body in VR helped them in doing "the exercise a lot more correctly" (P6) and in better "focusing on (my) body" (P6). Some participants talked about the discrepancy between their real self and virtual self in terms of appearance: "[The man in VR] was not exactly the same (to me), but the motion I was like the same with him." (P4) Interestingly, the discrepancy showed a positive potential for user motivation: "looking at the muscle man really attracted me to follow him." (P1) Overall, the 'virtual self' contributed to a "more enjoyable" (P8) exercise experience, consistent with the earlier IMI-enjoyment survey results.

6.2.3.2 Effectiveness of Real-time HUD

The participants liked the real-time informative HUD user interface, which provided "consistent feedback about time and quality" (P4): "You could [...] better simply because it tells you the [...] of the workout you are doing." (P7) They recalled how hard it was to "count the repetitions while focusing on the exercise" (P2), and said that "(easily) seeing how many repetitions I have" (P3) was very helpful. Also, with the duration and quality information they could "run clean repetitions" (P9), and found the app played a role as a "binder to improve (their) exercise quality" (P3).

6.2.3.3 Effectiveness of Muscle Highlighter

In providing a visual guide for the required muscle groups to activate, the participants found the muscle highlighter feature helpful; otherwise they "would never know how to do it properly" (P8): "What the machine is targeting is the muscle, so highlighting really helped me to perform well (and) instructed me." (P1) It naturally instructed the participants "how to focus (their) abdominal part" (P8), and assisted them to better "conceptualize the way of exercise" (P10), contributing to the effectiveness of the app validated earlier in the muscle activation analysis.

6.2.3.4 Points of Improvements and Design Suggestions

The participants suggested interesting points of improvement for the JARVIS application. One participant talked about a soft competition with another pacemaker avatar or friends online in the VR screen, related to the Köhler effect, a well-known theory about the benefits of competition in the literature of sports science [24, 16]: "I would make a goal to reach to keep it up with a guy in the VR." (P8) They also wanted to try the application for different machines, which may also be beneficial: "[it would be helpful for] other popular exercise I know like chest press. That will be very helpful because a lot of people do need better form in that. So those compound exercise would help people." (P10)

Chapter 7 Conclusion and Future Works

In this thesis, we presented the design, implementation and evaluation of JARVIS, a first-of-its-kind sensing system, based on a miniature IoT sensing device combined with a mobile VR headset, to enable an immersive and interactive gym exercise experience. JARVIS leverages immersive VR technology to provide users with interactive exercise analytics for machine exercises in real time. The realization of such an aim requires the VR headset to retrieve multidimensional information (i.e., repetition count, exercise type, duration of each repetition, quality score, etc.) about the machine exercise performed by the exerciser in real time. JARVIS achieves this by attaching miniature IoT sensing devices to gym machines to capture machine exercises, as well as by developing online analytics algorithms and VR performance optimization techniques to track exercise progress and assess exercise quality in real time. By converting the exercise progress and quality information to VR inputs, JARVIS creates a truly immersive gym exercise experience with a virtual avatar that guides machine exercises in a highly interactive manner and provides the exercise analytics to the user in real time in a convenient way using VR technology. Through extensive experiments, we have demonstrated that JARVIS can provide real-time machine exercise analytics. Our real-world deployment study indicates that JARVIS can provide engaging and effective exercise assistance to exercisers for strength training. Our future work includes evaluating the effectiveness of JARVIS with a broader population, designing more entertainment-oriented applications and games for gym settings, and extending JARVIS into a full-fledged gym context monitoring and management platform. We envision that JARVIS will eventually offer gym activities as fully immersive VR experiences.

BIBLIOGRAPHY

[1] Delsys Inc. http://www.delsys.com/.
[2] LIBSVM. http://www.csie.ntu.edu.tw/cjlin/libsvm/.
[3] Microsoft HoloLens. http://www.microsoft.com/microsoft-hololens.
[4] Monsoon. https://www.msoon.com/LabEquipment/PowerMonitor/.
[5] Oculus Rift. https://www.oculus.com/en-us/.
[6] Samsung Gear VR. http://samsung.com/galaxy/wearables/gear-vr.
[7] SensorTag from Texas Instruments. http://www.ti.com/.
[8] K. Adams, E. Cafarelli, A. Gary, C. Dooly, S. Matthew, S. J. Fleck, A. C. Fry, J. R. R. U. Newton, J. Potteiger, M. H. Stone, N. A. Ratamess, and T. Triplett-McBride. Progression models in resistance training for healthy adults, 2009.
[9] K.-H. Chang, M. Y. Chen, and J. Canny. Tracking free-weight exercises. Springer, 2007.
[10] Y. Degtyarev, E. Cuervo, and D. Chu. Demo: Irides: Attaining quality, responsiveness and mobility for virtual reality head-mounted displays. In Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services, pages 443–443. ACM, 2015.
[11] H. Ding, L. Shangguan, Z. Yang, J. Han, Z. Zhou, P. Yang, W. Xi, and J. Zhao. FEMO: A platform for free-weight exercise monitoring with RFIDs. In Proceedings of the 13th ACM Conference on Embedded Networked Sensor Systems, pages 141–154. ACM, 2015.
[12] K. Holtzblatt and S. Jones. Contextual inquiry: A participatory technique for system design. Participatory design: Principles and practices, pages 177–210, 1993.
[13] A. Jain and D. Zongker. Feature selection: Evaluation, application, and small sample performance. Pattern Analysis and Machine Intelligence, IEEE Transactions on, 19(2):153–158, 1997.
[14] G. M. Karst and G. M. Willett. Effects of specific exercise instructions on abdominal muscle activity during trunk curl exercises. The Journal of Orthopaedic and Sports Physical Therapy, 34(February 2001):4–12, 2004.
[15] L. Karvitz. Exercise Motivation: What Starts and Keeps People Exercising. University of New Mexico, 2011.
[16] O. Köhler. Über den Gruppenwirkungsgrad der menschlichen Körperarbeit und die Bedingung optimaler Kollektivkraftreaktion [Human physical performance in groups and conditions for optimal collective performance]. Industrielle Psychotechnik, 4:209–226, 1927.
[17] P. F. LaChance and T. Hortobagyi. Influence of Cadence on Muscular Performance During Push-up and Pull-up Exercise, 1994.
[18] C. L. Lox, K. A. M. Ginis, and S. J. Petruzzello. The psychology of exercise: Integrating theory and practice. 2006.
[19] A. Maiti and G. Challen. The missing numerator: Toward a value measure for smartphone apps. In Proceedings of the 16th International Workshop on Mobile Computing Systems and Applications, pages 99–104. ACM, 2015.
[20] M. I. Miller and J. M. Medeiros. Recruitment of internal oblique and transversus abdominis muscles during the eccentric phase of the curl-up exercise. Physical Therapy, 67(8):1213–1217, 1987.
[21] D. Morris, T. S. Saponas, A. Guillory, and I. Kelner. RecoFit: Using a wearable sensor to find, recognize, and count repetitive exercises. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 3225–3234. ACM, 2014.
[22] B. J. Mortazavi, M. Pourhomayoun, G. Alsheikh, N. Alshurafa, S. I. Lee, and M. Sarrafzadeh. Determining the single best axis for exercise repetition recognition and counting on smartwatches. In Wearable and Implantable Body Sensor Networks (BSN), 2014 11th International Conference on, pages 33–38. IEEE, 2014.
[23] M. Muehlbauer, G. Bahle, and P. Lukowicz. What can an arm holster worn smartphone do for activity recognition? In Wearable Computers (ISWC), 2011 15th Annual International Symposium on, pages 79–82. IEEE, 2011.
[24] K. A. Osborn, B. C. Irwin, N. J. Skogsberg, and D. L. Feltz. The Köhler effect: Motivation gains and losses in real sports groups. Sport, Exercise, and Performance Psychology, 1(4):242–253, 2012.
[25] T. Park, I. Hwang, U. Lee, S. I. Lee, C. Yoo, Y. Lee, H. Jang, S. P. Choe, S. Park, and J. Song. ExerLink: Enabling pervasive social exergames with heterogeneous exercise devices. In Proceedings of the 10th International Conference on Mobile Systems, Applications, and Services, pages 15–28. ACM, 2012.
[26] R. W. Plant and R. M. Ryan. Intrinsic motivation and the effects of self-consciousness, self-awareness, and ego-involvement: An investigation of internally controlling styles. Journal of Personality, 53(3):435–449, 1985.
[27] A. Rizzo, J. Difede, B. O. Rothbaum, J. M. Daughtry, and G. Reger. Virtual reality as a tool for delivering PTSD exposure therapy. Post-Traumatic Stress Disorder: Future Directions in Prevention, Diagnosis, and Treatment, 2013.
[28] C. Rojek. Leisure and culture. Macmillan Press, 2000.
[29] R. M. Ryan, J. P. Connell, and R. W. Plant. Emotions in nondirected text learning. Learning and Individual Differences, 2(1):1–17, 1990.
[30] B. J. Schoenfeld and B. Contreras. Attentional Focus for Maximizing Muscle Development. Strength and Conditioning Journal, 38(1):27–29, 2016.
[31] A. Strauss and J. Corbin. Basics of qualitative research: Grounded theory procedures and techniques, volume 13. Sage, London, 1990.
[32] T. Wakahara, A. Fukutani, Y. Kawakami, and T. Yanai. Nonuniform muscle hypertrophy: Its relation to muscle activation in training session. Medicine and Science in Sports and Exercise, 45(11):2158–2165, 2013.
[33] T. Wakahara, N. Miyamoto, N. Sugisaki, K. Murata, H. Kanehisa, Y. Kawakami, T. Fukunaga, and T. Yanai. Association between regional differences in muscle activation in one session of resistance exercise and in muscle hypertrophy after resistance training. European Journal of Applied Physiology, 112(4):1569–1576, 2012.
[34] M. Zhang and A. A. Sawchuk. A feature selection-based framework for human activity recognition using wearable multimodal sensors. In Proceedings of the 6th International Conference on Body Area Networks, pages 92–98. ICST (Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering), 2011.