LIBRARY
Michigan State University

This is to certify that the dissertation entitled

WHY DO SOME PRESERVICE TEACHERS TRUST DIGITAL TECHNOLOGY AND OTHERS DON’T? CONCEPTUALIZING THE INTERSECTION OF TRUST, TECHNOLOGY, AND EDUCATION

presented by

ANDREA PLOUCHER FRANCIS

has been accepted towards fulfillment of the requirements for the Ph.D. degree in Educational Psychology and Educational Technology

Major Professor’s Signature

Date

MSU is an Affirmative Action/Equal Opportunity Employer

WHY DO SOME PRESERVICE TEACHERS TRUST DIGITAL TECHNOLOGY AND OTHERS DON’T?
CONCEPTUALIZING THE INTERSECTION OF TRUST, TECHNOLOGY, AND EDUCATION

By

Andrea Ploucher Francis

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

Educational Psychology and Educational Technology

2010

ABSTRACT

WHY DO SOME PRESERVICE TEACHERS TRUST DIGITAL TECHNOLOGY AND OTHERS DON’T? CONCEPTUALIZING THE INTERSECTION OF TRUST, TECHNOLOGY, AND EDUCATION

By

Andrea Ploucher Francis

With the increased availability of technology to teachers, it becomes important for researchers and educators alike to understand why teachers choose to use technology for educational purposes. In this study, I use a weak version of the Computers as Social Actors (CASA) hypothesis (Reeves and Nass, 1996; Nass and Moon, 2000) to extend the concept of trust in human-human relationships to trust in human-computer relationships. I review and extend conceptualizations of trust from multiple disciplines to trust in educational technology. Because trust develops over time, preservice teachers, who are still developing their relationship with technology, were studied. An instrument consisting of vignettes entailing different levels of risk, designed to measure preservice teacher trust in educational technology, was piloted and found to be reliable.
Then, 136 preservice teachers at a large midwestern university completed an online instrument designed to explore the following issues: how trusting people compares with trusting technology; whether the level of situational risk affects how much preservice teachers say they will trust a digital technology and whether they report they will use a particular digital technology; how demographics and individual skills (gender, race and ethnicity, age, socioeconomic status, the level the teacher plans to teach, the content area the preservice teacher feels most equipped to teach, and computer self-efficacy), psychological traits (how trusting the person is in general and how risk-taking the person is in general), and experiential factors (experience with technology in classes) contribute to that preservice teacher’s decision to trust the technology; and to what extent and in what ways the explanatory variables for trusting a digital technology also explain what a preservice teacher reports about future usage of digital technology.

A combination of results from scale reliabilities, a correlation analysis between general trust scores and trust in educational technology scores, and a review of the literature suggests that trusting people is conceptually both similar to and different from trusting technology. Trusting people and trusting technology both involve consideration of the reliability, competence, and honesty of the trustee. When trusting people, benevolence and openness are also considered. Qualitative data suggest that teachers’ reasons for using educational technology were technologically and pedagogically specific, but not influenced by the risk level entailed in the situation. Trusting technology seemed to be based on both the specific technology and the risk level of the situation.
Regression analyses found that participants who were more confident with technology and had positive experiences with digital technology in their Teacher Education classes were more likely to say they trust digital technology. The more preservice teachers said they trust digital technology, and the more positive their experiences with digital technology in their Teacher Education classes, the more likely they were to report they would use educational technology. Implications of the results and of the use of vignettes in instrument construction are discussed.

ACKNOWLEDGMENTS

Not what we give,
But what we share,
For the gift without the giver
Is bare.
~James Russell Lowell

Without the guidance and support of my committee, family, and friends, this dissertation would not have been possible. Six years ago, while I was still in the Cognitive Psychology program at Michigan State University, I began working on a project with Dr. Punya Mishra on how children interacted with robotic animals. At the same time, I took a Cognition and Technology class from Dr. Matthew Koehler. Inspired by both the content and the support I received from Dr. Mishra and Dr. Koehler, I decided to switch to the Educational Psychology and Educational Technology program. I have never regretted this decision and am extremely grateful to Dr. Mishra and Dr. Koehler for continuing to mentor and support me in my work and in life. I would like to thank Dr. Rand Spiro for his useful suggestion of using vignettes in my instrument to help capture the complex nature of instructional technology decisions. I would like to thank Dr. Nicole Ellison for helping me navigate the realm of online surveys and suggesting ways to compensate participants. While not on my committee, there are a few more people I would like to thank specifically for their help with this dissertation. Working for Dr. Raven McCrory has been an absolute pleasure.
With her project I have learned lessons about instrument construction, navigating university systems, and analyzing different kinds of data. I thank her for constantly being there to support me both in my own research and in my personal life. I want to thank my step-father, Dr. Dennis Luckey, for helping with the statistics in this project. I would like to thank Anne Heintz for editing this dissertation, and I would like to thank my mother, sisters, and friends for their constant emotional support. Last, but certainly not least, I would like to thank my husband, Dr. Jeremy Francis. On numerous occasions Jeremy took care of our baby, Alex, so that I could work. Thank you for not only being my best friend through this project, but also being an invaluable source of ideas and feedback.

TABLE OF CONTENTS

LIST OF TABLES .................................................. ix

LIST OF FIGURES .................................................. x

CHAPTER ONE
INTRODUCTION .................................................. 1

CHAPTER TWO
SURVEY OF THE LITERATURE .................................................. 4
    Digital Technology in Education .................................................. 7
        Resources .................................................. 10
        Knowledge and Skills .................................................. 10
        Institutional Structure .................................................. 10
        Assessment .................................................. 11
        Attitudes and Beliefs .................................................. 11
        Subject Culture .................................................. 12
        Pedagogical Decisions .................................................. 12
    Digital Technology and Intentionality .................................................. 13
    Conceptualizing Trust in Technology .................................................. 18
        Moral Models .................................................. 21
        Rational Choice Models .................................................. 21
        Social System Models .................................................. 23
        Relational Models .................................................. 25
        Discussion of Models .................................................. 28
    Extending a Relational Model of Trust to Educational Technology .................................................. 29
        Vulnerability and Risk .................................................. 29
        Competence .................................................. 31
        Reliability .................................................. 32
        Benevolence .................................................. 33
        Honesty .................................................. 34
        Openness .................................................. 35
        Conclusions .................................................. 35
    Purpose of this Study .................................................. 35
    Discussion .................................................. 37

CHAPTER THREE
DESIGN AND LOGIC OF VARIABLES .................................................. 39
    Dependent Variable: Instrument to Measure Use of and Trust in Digital Technology .................................................. 39
        Pilot Studies .................................................. 42
            Situation Risk Assessment .................................................. 42
            Test-retest Reliability and Scale Reliability .................................................. 46
    Predictive Variables .................................................. 47
        Demographic Factors and Individual Skills .................................................. 48
        Computer Self-Efficacy .................................................. 48
        General Trust .................................................. 49
        Risky Behavior .................................................. 49
        Experience with Digital Technology .................................................. 50
    Conclusions .................................................. 51

CHAPTER FOUR
METHODS .................................................. 52
    Participants .................................................. 52
        Gender and Level Preservice Teachers Plan to Teach .................................................. 53
        Area of Interest .................................................. 54
        Age .................................................. 54
        Race .................................................. 55
        Socioeconomic Status .................................................. 55
        Early Versus Late Responders .................................................. 56
        Representativeness of the Sample Population .................................................. 57
    Instruments .................................................. 57
    Study Design .................................................. 58
    Procedure .................................................. 58

CHAPTER FIVE
RESULTS .................................................. 60
    Trusting People Versus Trusting Technology .................................................. 61
    Teacher Trust and Reported Usage of Technology .................................................. 63
    Risk Level Analysis .................................................. 65
        Qualitative Data Analysis .................................................. 67
    What Factors Contribute to Teacher Trust in Technology? .................................................. 70
    What Factors Contribute to Teacher Reported Usage of Technology? .................................................. 77
        Path Analysis: Trust, Teacher Education, and Reported Usage .................................................. 83
    Conclusions .................................................. 89

CHAPTER SIX
CONCLUSIONS AND A LOOK TO THE FUTURE .................................................. 90
    Lessons from Instrument Construction .................................................. 92
    The Importance of Risk Level and Pedagogical Philosophy when Conceptualizing Trust as a Behavior Versus Trust as a Belief .................................................. 95
    Creating Positive Experiences with Technology in Teacher Preparation Courses .................................................. 97
    Trustworthiness of People Versus Trustworthiness of Technology .................................................. 102
    Limitations and Future Study .................................................. 104

APPENDICES
    A – Risk Level Pilot Instrument .................................................. 110
    B – Preservice Teacher Trust in Digital Technology Instrument .................................................. 112
    C – Consent Form .................................................. 124
    D – Prenotice .................................................. 125
    E – Request for Participation .................................................. 126
    F – Thank-You/Reminder .................................................. 127
    G – Last Request .................................................. 128
    H – Reported Trust Item Descriptive Statistics .................................................. 129
    I – Reported Usage Item Descriptive Statistics .................................................. 134
    J – Open-Ended Comments .................................................. 139

REFERENCES .................................................. 161

LIST OF TABLES

Table 1 Breakdown of Gender and Level the Student Plans to Teach .................................................. 53
Table 2 Breakdown of Interest Area by Gender .................................................. 54
Table 3 Breakdown of Preservice Teachers by Age .................................................. 55
Table 4 Breakdown of Student Race .................................................. 55
Table 5 Breakdown of Preservice Teachers by Mother and Father Education Level .................................................. 56
Table 6 Breakdown of Preservice Teacher Mother Education Level by Gender .................................................. 56
Table 7 Descriptive Statistics for Trust and Risk Measures .................................................. 62
Table 8 Zero-order and Part Correlation Coefficients with the Trust Dependent Variable .................................................. 76
Table 9 Summary of Hierarchical Regression Analysis for Variables Predicting Preservice Teacher Trust in Classroom Digital Technology .................................................. 77
Table 10 Summary of Hierarchical Regression Analysis for Variables Predicting Preservice Teacher Trust in Classroom Digital Technology in High Risk Situations .................................................. 78
Table 11 Summary of Hierarchical Regression Analysis for Variables Predicting Preservice Teacher Trust in Classroom Digital Technology in High Risk Situations .................................................. 79
Table 12 Zero-order and Part Correlation Coefficients with the Reported Usage Dependent Variable .................................................. 81
Table 13 Summary of Hierarchical Regression Analysis for Variables Predicting Preservice Teacher Use of Classroom Digital Technology .................................................. 83
Table 14 Correlation Matrix, Mean Values, and Standard Deviation of Observed Variables (n = 136) .................................................. 85
Table 15 Covariance Matrix of Observed Variables .................................................. 85
Table H Reported Trust Item Descriptive Statistics .................................................. 129
Table I Reported Usage Item Descriptive Statistics .................................................. 134

LIST OF FIGURES

Figure 1. Trust and Reported Usage mediated by level of risk in the situation .................................................. 67
Figure 2. Preservice teacher reported use of computers in education classes .................................................. 72
Figure 3. Preservice teacher experience with digital technology at MSU .................................................. 73
Figure 4. How often preservice teachers say they will use computers in their future instruction .................................................. 74
Figure 5. Path analysis results of the saturated model (Model A) with both an indirect relationship and a direct relationship between TE experience and Reported Usage .................................................. 86
Figure 6. Path analysis results of Model B with an indirect relationship between TE experience and Reported Usage .................................................. 87
Figure 7. Path analysis results of Model C with a direct relationship between TE experience and Reported Usage ..................................................
CHAPTER ONE
Introduction

In Mendota Heights, Minnesota, a fourth grade teacher has students use educational applications on the iPod Touch to learn about vocabulary, geometry, and geography. The teacher reports that the students are very engaged, and one student says he likes learning with the technology, “in part because it’s ‘something that is more newer than paper’” (January, 2010).

In New York City, about 6,000 students in 22 middle schools received laptops in 2005 as part of a $45-million, three-year program financed with city, state, and federal money. However, the Liverpool Central School District, also in New York, recently decided to phase out its laptops a year early. A recent New York Times article reported that the laptop program was dropped because teachers were frustrated with the unreliability of the computers. Indeed, a room that was once used for the yearbook club became an on-site repair shop for the 80 to 100 machines that broke each month (May, 2007).

In Norwich, Connecticut, a teacher was suspended and arrested after pornographic pop-ups flooded the computer screen in her classroom. Several students saw the sexually explicit images, and the teacher was arrested for risk of injury to a minor. The teacher claimed she did not turn off the computer because it was school policy not to turn off computers. A Times Online article quoted a blogger who wrote that the teacher was “framed by the computer” (March, 2007).

The first story relates how a teacher successfully integrates technology into her instruction. Several instructors at Somerset Elementary in Mendota Heights have students use educational applications on the iPod Touch, a portable media player and personal digital assistant, to learn about various topics. For example, second graders learn how to read a clock by comparing digital time with clock hands on their iPods.
One could argue that not all schools have access to technology that “is more newer than paper.” However, many schools do have access to digital technology, and yet teachers choose not to use the technology available for instruction. For example, in the second vignette, Liverpool Central School District was given millions of dollars to enhance education through the use of laptops. The district certainly had the available technology, but decided to phase out the laptop program because the technology was not reliable.

The last vignette shows an even more personal reason for a teacher to choose not to use technology in instruction than the technology simply being unreliable. In the third story, a substitute teacher is arrested because pornographic pop-ups flooded the computer screen. Perhaps she should have turned the computer off or covered it, but this was probably a new situation and she was not sure what to do with the computer. Likewise, perhaps Liverpool School District needed more technical support. Nonetheless, these last two stories indicate reasons why a teacher might choose not to use digital technology even if it were available: namely, that the technology was not reliable and that the technology was not competent.

The cases above exemplify the controversy over using digital technologies in educational settings. As portrayed in the first vignette, digital technologies can lead to new ways of instruction and learning. However, if millions of dollars are going to be spent to encourage digital technology use in the classroom, educational researchers must understand why certain teachers choose to use digital technologies while others do not. The goal of this dissertation is not to decide whether educational technology should be implemented in different situations, but rather to understand why teachers choose to use it or not.
Based on prior research, some reasons for the exclusion of digital technologies include external factors, such as a lack of social and institutional support, funding, or adequate training for the task, and psychological factors, such as a fear of using digital technologies and the inability to overcome “functional fixedness,” a bias that limits a person to using objects only in the way they are traditionally used (Koehler & Mishra, 2008). The teacher in the first story overcame functional fixedness when she used applications on the iPod Touch, a device normally used to play music and videos, to help students learn about telling time on clocks. However, none of the above factors touch on the personal relationship that develops between a teacher and the technology, in which the technology can actually pose a threat to instruction. Indeed, the Computers are Social Actors (CASA) paradigm suggests that humans often do behave toward computers as they would toward humans (Reeves & Nass, 1996; Nass & Moon, 2000).

I suggest that another psychological reason why teachers do not employ digital technology in the classroom is that they do not trust digital technologies. In the second story, the teachers might not have trusted the technology to be reliable; in the third story, the teacher could not trust the technology to be competent and to display only appropriate material. In this dissertation, I extend the psychological construct of trusting people to trusting technology. Trust develops over time; thus, it is important to understand how much teachers who are still developing their conceptualizations of teaching (preservice teachers) trust technology. This leads to a series of questions, namely: What makes a teacher trust digital technology? And does trusting or not trusting technology have an impact on the way preservice teachers think they will use technology in their teaching?
Since teacher preparation programs are the sites that should be providing positive experiences and ideas for teaching with technology, they are a logical place to study the origins of trust, including possible explanatory variables and how they contribute to teacher use of technology for teaching. In Chapter 2, I begin by reviewing literature on digital technology in the classroom. I then discuss whether humans view digital technology as a social actor and under what conditions a person might behave toward technology as if it had intentionality. I explore conceptualizations of trust in many disciplines and connect the literature on trust with classroom digital technology by extending a multifaceted definition of trust in humans to trust in digital technology.

There are five key research questions in this study:

1. How does trust in digital technology compare with trust in other people?

2. Does preservice teacher trust correlate with what that person says he or she will do in a situation where he or she encounters digital technology in the classroom?

3. Given a situation that entails some risk, does the level of risk affect whether preservice teachers say they will trust a digital technology? Does the level of risk affect whether preservice teachers say they will use a particular digital technology?

4. How do demographics and individual skills (gender, race and ethnicity, age, socioeconomic status, what level the teacher plans to teach, what content area the preservice teacher feels most equipped to teach, and computer self-efficacy), psychological traits (how trusting and how risk-taking the person is in general), and experiential factors (experience with technology in classes) contribute to a preservice teacher’s decision to trust the technology?

5. To what extent and in what ways do the explanatory variables for trusting a digital technology also explain what preservice teachers say they will do in their classroom?
Chapter 3 describes how this multifaceted conceptualization of trust was used to guide the creation of an instrument designed to measure preservice teacher trust in classroom digital technology in situations entailing different levels of risk. After the Trust in Digital Technology Instrument was piloted and tested, it was combined with instruments designed to assess demographics and individual skills, psychological traits, and experiential factors. Chapter 4 describes how the combined instrument was administered to a random group of 700 preservice teachers at Michigan State University. Of those 700 students, 136 chose to participate in the study. In Chapter 5, I use a mixed models procedure, a small qualitative assessment of student comments, regression analyses, and a path analysis to address the five key research questions. Chapter 6 discusses the results with reference to previous findings and the limitations of the study. Finally, I propose several future directions for the study.

CHAPTER TWO
Survey of the Literature

From Rural American Teacher, 1928: “Students today depend upon store bought ink. They don’t know how to make their own. When they run out of ink they will be unable to write words or ciphers until their next trip to the settlement. This is a sad commentary on modern education” (cf. Thornburg, 1992).

In his book about technology’s role in education, Thornburg (1992) cites the above as a reason why teachers resisted the use of store-bought pens in education. We now know, though, that disposable pens are used regularly in educational settings and that running out of ink is not an insurmountable problem. Though very few of us know how to make our own ink, the lack of ink-making ability does not seem to have impeded education. Like the store-bought pen, computers and other digital technologies are prevalent in society.
With the increased presence of educational technology, it continues to be important that researchers and teacher educators understand how teachers think about technology and its role in education. Many factors may affect not only how teachers use technology, but also the circumstances in which they will use it. One possible reason teachers choose not to use particular technologies is that teachers may differentially “trust” technology.

In this chapter, I show how the digital technology available for instruction has grown and describe proposed reasons that teachers do not take advantage of digital technology. I suggest that none of these reasons capture the intimate relationship that can occur between humans and technology. I discuss how human-technology interactions have been studied and suggest that humans, under certain conditions, do indeed behave toward technology as they might toward another person. Thus, one way to understand teacher decision making about technology use is through the construct of trust. I review how trust has been conceptualized across many different disciplines and suggest a way to conceptualize trust in technology. Finally, based on a review of the literature, I suggest a way to study the potential differences between trusting people and trusting technology, and whether preservice teacher trust in technology influences reported future usage of technology.

Digital Technology in Education

While there are many types of technology, this dissertation is concerned with educational technology. Educational technology is defined as the sum of tools, techniques, and collective knowledge applicable to education (Koehler & Mishra, 2008). There are two distinct types of technologies currently found in K-12 educational settings: analog and digital.
Analog technologies are older technologies, such as the chalkboard and the pencil, whereas digital technologies are newer technologies that use digital pulses, signals, or values to represent data in computer graphics, telecommunications systems, and word processing (the computer, the Internet, cell phones, etc.). Digital technology in classrooms can be used as a tool to search for and find patterns in huge sets of data (database software), a mediator for communication (e-mail), a tool for design and manipulation (drawing or drafting software), a tool for artistic expression (movie and image software), and a way to fill various social roles, such as opponent or partner (video games) or tutor (computer-assisted learning). Henceforth, the term technology is used interchangeably with digital technologies used for instructional purposes.

Reflecting the increased importance of technology integration in education, the U.S. Department of Education has stated, “Technology is now considered by most educators and parents to be an integral part of providing a high-quality education” (U.S. DOE, 2003, p. 3). Based on the idea that children will learn about technology from their teachers, states now include technology proficiency as a licensing criterion for teachers (Zhao, 2003). The International Society for Technology in Education’s (ISTE) National Education Technology Standards (NETS) for teachers (ISTE, 2008) suggest that teachers 1) facilitate and inspire student learning and creativity through the use of digital tools, 2) design and develop digital-age learning experiences and assessments, 3) model digital-age work and learning, and 4) promote and model digital citizenship and responsibility.

Most American schools today have access to computers and the Internet. In the past, the term “digital divide” in classrooms referred to whether a school had access to digital technology.
However, this term has come to describe whether and how digital technology is used in classrooms, just as much as whether there is access (Attewell, 2001; Kelly, 2008; MacGillis, 2004; Warschauer, Knobel & Stone, 2004). According to the Secretary’s Fourth Annual Report on Teacher Quality, virtually every school (99%) with access to computers has Internet access (Parsad & Jones, 2005). The Pew Internet and American Life Project found that roughly 87% of youth between the ages of 12 and 17 use the Internet (Hitlin & Rainie, 2005). Of these youth, 78% said they use the Internet at school. Lenhart, Rainie, and Lewis (2001) found that 71% of online teens said they relied mostly on Internet sources for the last big project they did for school. The U.S. Bureau of the Census (2003) found that 57% of all children in school ages 7-17 use a home computer to complete school assignments.

Despite the increased access to digital technology, many teachers still choose not to use digital technology even when it is readily available (for example, see Hew & Brush, 2007). Many reasons have been proposed for why teachers choose not to use digital technology when teaching. Some reasons for the exclusion of digital technologies include external factors, such as a lack of social and institutional support or funding and a lack of adequate training for the task, and psychological factors, such as a fear of using digital technologies and the inability to overcome “functional fixedness,” a bias that limits a person to using objects only in the way they are traditionally used (Cuban, 2001; Floden & Bell, 2006; McCrory, 2006; Koehler & Mishra, 2008; Zhao, Pugh, & Byers, 2002; Hew & Brush, 2007). Hew and Brush (2007) reviewed 48 studies that addressed barriers affecting the use of computing devices in K-12 schools for instructional purposes.
They found six main categories of barriers: 1) resources, 2) knowledge and skills, 3) institutional barriers, 4) assessment, 5) attitudes and beliefs, and 6) subject culture. Related to barriers to technology use, McCrory (2006) describes five affordances of technology for teaching: boundaries, stability, authority, pedagogical context, and disciplinary context. When McCrory talks about affordances for teaching, she is really discussing the trade-offs entailed in using such technology. Below are the six categories described by Hew and Brush (2007) and how McCrory’s (2006) affordances fit within that framework. Since three of McCrory’s affordances (stability, authority, and pedagogical context) do not fit well in the framework, an additional category had to be added. These three affordances concern attitudes and beliefs about non-subject-specific pedagogical decisions, not attitudes and beliefs about the role of technology, so the category of pedagogical decisions is added to Hew and Brush’s (2007) six main categories.

Resources

Examples of a resource barrier include a lack of access to the available technology, a lack of time to learn and use the technology, and a lack of technical support to help with the implementation.

Knowledge and Skills

Related to a lack of time to learn the technology, teachers may not have the knowledge to design and conduct meaningful learning opportunities that use technology. There are numerous technologies that teachers can choose from, and it would be impossible for a teacher to know all of them. Further, teachers may feel threatened by the vast amount of knowledge found on the Internet.

Institutional Structure

Examples of an institutional barrier include a lack of interest in technology from principals and administrators, a lack of a continuous time block that allows for meaningful technology use, and a lack of planning about what to do with technology once it is installed.
Cohen (1988) suggests that since whole-class instruction is used in most schools, having a few computers in a classroom can lead to management problems. A teacher may have a hard time giving individualized attention to both a group on the computers and a group that is not working on the computers.

Assessment

An example of an assessment barrier is the large allocation of time devoted to state-wide and national testing, which leaves little time to plan how to creatively integrate technology into instruction. Further, digital technology like the graphing calculator is not allowed during national examinations, discouraging teachers from using such tools when preparing students for high-stakes testing.

Attitudes and Beliefs

For Hew and Brush (2007), attitudes and beliefs may serve as a barrier depending on whether teachers view technology as “a way to keep kids busy” or as being relevant to the curriculum. Some teachers in the United States see technology as a reward or a way to keep kids busy rather than as a powerful instructional tool (Ertmer, 2005; Ertmer, Addison, Lane, Ross, & Woods, 1999).

Subject Culture

When Hew and Brush (2007) refer to subject culture, they are referring to the shared beliefs and values of a subject-area community. This “culture” of the subject may serve as a barrier. For example, Selwyn (1999) describes an art teacher who believed painting with one’s hands required a different kind of aesthetic ability than moving a mouse to create a picture. The art teacher clearly sees the computer as a tool that interferes with the experience of painting rather than as an extension of the aesthetic experience. McCrory’s (2006) categories of boundary affordances and disciplinary affordances are related to subject culture. Boundary affordances are related to the topics that the digital technology supports.
For example, McCrory describes that simply having students “do research” on the Internet does not specify the content to be researched or the spatial boundary around which such research is conducted. Teachers must decide whether they trust the boundaries entailed with the digital technology. Do they trust the program? Do they trust the Internet and everything that entails? Do they trust the students to know which sites to use for research? If the boundaries are clearly defined, such as the teacher telling the students how to analyze websites specifically related to a particular topic, then students could potentially gain a great deal of information in addition to gaining valuable critical analysis skills. Finally, the disciplinary context is related to curricular coherence, which includes developmental appropriateness, coherence of topics, and relevance and accuracy with respect to the domain. A math teacher must trust the designers of computer tutorials to teach appropriate material when considering implementation.

Pedagogical Decisions

Pedagogical decisions are those non-subject-specific attitudes and beliefs about the relative importance of different kinds of instruction. Stability affordances are related to how much the technology may or may not change over time. If teachers invest time in a technology that will disappear the next year, they may not trust any technology to maintain its stability over time. In this sense, the technology is not reliable over time. Authority is directly connected with the concept of competence. For example, a teacher must judge the accuracy of the information on sites students use for research. The teacher must trust that a site is accurate in order to use it. Using an accurate set of information will likely enhance instruction and positively influence the competence and perceived authority of the teacher.
According to McCrory (2006), pedagogical context refers to the extent to which the technology monitors, manages, and evaluates student work. What kind of feedback does the technology give teachers and students about learning? In some cases there is no feedback at all. In some cases the feedback might be dishonest or inaccurate. A teacher must decide how much to trust the technology to adequately monitor, manage, and evaluate student learning.

Discussion of Barriers

The first four barriers discussed (resources, knowledge and skills, institutional structure, and assessment) all assume that the teacher either does not have the skills or cannot control the situation. In this sense, the teacher is not actually making a choice; the teacher is simply acting according to a subset of knowledge within a structure of power. Attitudes and beliefs, subject culture, and pedagogical decisions do endow the teacher with a sense of free will. The teacher chooses not to use the technology because he or she believes it is not useful or subject appropriate. But none of these three facets address the more intimate relationship that often exists between people and technology. Another way to understand a teacher’s decision not to use digital technology is that the teacher is acting as if the technology were an untrustworthy social actor. The following section explores beliefs about and behavior toward technology.

Digital Technology and Intentionality

Even though we may not believe technology has a mind of its own, there is evidence that we behave toward computers in ways that we would behave toward other people. Dennett (1987) suggests that when humans must make a prediction about the future, they take one of three stances: the physical stance, the design stance, or the intentional stance. The physical stance is used when one applies basic knowledge of physics to make a good prediction of a system’s behavior.
For example, if I drop a penny, I know that gravity, working as a system, will cause the penny to fall to the ground. The design stance is typically used for systems planned and created for a specific purpose. Dennett (1987) uses a person’s approach to an alarm clock to illustrate the design stance. He writes:

[O]ne does not know or care to know whether it is spring wound, battery driven, sunlight powered, made of brass wheels and jewel bearings or silicon chips — one just assumes that it is designed so the alarm will sound when it is set to sound . . . and is designed more or less accurately, and so forth. (p. 17)

In the design stance, a designer creates an alarm clock for a particular purpose. Any time the alarm clock does not perform the task it is designed to perform, we consider it to be the fault of the designer, not a betrayal by the alarm clock. When we take the intentional stance toward an entity, however, we go beyond the physical and design stances and assume that the other entity has some sort of intentionality. When we predict that a bird will fly away when it sees that a cat is coming, we are using the intentional stance to understand the behavior of the bird. The bird has a mind of its own and has deliberately decided to move away from the cat because past experience tells the bird that cats sometimes eat birds. Generally, we reserve the intentional stance for complex systems that we cannot predict using either of the other two stances.

An example of taking an intentional stance toward digital technology would be a child who says that the computer “cheats.” Sherry Turkle (2005) describes a situation in which Robert, an eight-year-old boy, is playing with a computer called Merlin. Merlin plays tic-tac-toe. At one point, Robert gets frustrated with Merlin and accuses the computer of cheating. Other children around Robert argue that Merlin cannot cheat, because “knowing is part of cheating” (p. 51).
In this case, Robert feels that Merlin is a social actor that is actually cheating him, but the other children treat Merlin as a machine. Being able to imagine what an artifact’s intentions might be in order to predict or explain its behavior is an example of taking an intentional stance.

Another idea closely related to taking an intentional stance toward digital technology is the Computers are Social Actors (CASA) paradigm. In the CASA paradigm, Reeves and Nass (1996) and Nass and Moon (2000) suggest that humans behave toward computers as they would toward humans because humans over-apply social scripts. A script is simply a sequence of behaviors that is commonly used in a particular situation. For example, joke-telling, sharing life stories, and general conversations are all considered social scripts. Social scripts support humans in learning to start and maintain turns in a conversation. People follow scripts without needing to pay attention or use mental resources to plan the next behavior. Nass and Moon suggest that people approach interactions with computers mindlessly and rely on prior social scripts to guide them in their interactions.

Evidence for perceptions that digital technologies have characteristics of a social actor comes from a study done by Alvarez-Torres, Mishra, and Zhao (2001). They replicated a study done in second language acquisition (SLA) that found that people tend to believe that a native speaker is more credible than a non-native speaker. Alvarez-Torres and colleagues replaced the human instructor with native-speaker or non-native-speaker computer tutors. In a recall test, participants in the native-speaker computer condition remembered more than participants in the non-native-speaker condition. These results suggest that the way participants viewed the computer affected how much they learned from the computer tutorial, just as the way humans view human instructors affects what they learn from human instruction.
The participants felt that the software with the native-speaker voice was more competent and behaved accordingly. It is important to note that the above task deals with a situation in which people can rely on some sort of social script (stereotyping). Instead of looking at a situation where a social script is readily available, Mishra (2006) looked at a situation where a person does not necessarily have a specified and practiced social script. People do not always respond to flattery and praise in predictable ways. Indeed, praise and criticism can be interpreted differently depending on the perceived difficulty of the task, the perceived ability of the student, and the student’s success or failure at completing the task relative to other students. Interpretation of the feedback can influence how a person responds to the praise or criticism (e.g., Henderlong & Lepper, 2002; Kamins & Dweck, 1999; Mueller & Dweck, 1998). For instance, Henderlong and Lepper (2002) describe a series of studies showing that praise for success in a task perceived as easy can negatively affect a student’s self-confidence, while being blamed for failing a task perceived as difficult can actually have a positive effect on student confidence.

Mishra (2006) replicated a human-human interaction study done by Meyer, Mittag, and Engler (1986). In the study, participants had to detect rules in logic problems. Rather than being provided by humans, as in the Meyer et al. study, the praise and criticism were provided by computers through a very simple textual interface. Students liked praise from the computer irrespective of the difficulty of the task and their success at it. The fact that students preferred praise from the computer suggests that on some level, students valued the computer’s opinion. However, the responses did not fully match the results found by Meyer et al. (1986), in that praise never actually negatively affected the student.
Mishra (2006) argues that the difference in results may be due to participants not being willing to engage in the mentally taxing task of understanding the other agent's intentionality. This differs from Nass and Moon's (2000) hypothesis that people are simply responding in an automatic, social-script-based fashion. If people were responding this way, then the results should have been identical to those of the complex human-human interaction (praise for success in a task perceived as easy can negatively affect a student's self-confidence, while being blamed for failing a task perceived as difficult can actually lead to a positive effect). This result suggests that we are not always following social scripts when interacting with computers. Nonetheless, the participants in Mishra's (2006) study did seem to take an intentional stance toward the computer in that they liked to be praised by the computer. In Dennett's (1987) philosophy, a person takes an intentional stance toward another entity when one cannot use a physical or design stance. However, in Mishra's (2006) study, the person could have taken a physical or design stance, and then there would have been no difference in the participants' preference for praise over no feedback. Instead, I suggest that a person will move to the mentally taxing intentional stance to avoid losses or to gain in social or financial standing. If I know what another person is thinking, I can benefit more from the given situation. For example, in multiple motive games both parties must trust each other in order for both to gain something. However, if one person betrays another person, then the betrayed party loses something. But the only way to know what choice the other is going to make is to try to think like the other person and assume that the other person has intentionality. When you have something to gain or lose, it is beneficial to think about the intentionality (intentional stance) of another entity.
When writing a paper, I think about whether my computer is going to crash or not. I would lose the paper if the computer crashed. Thus, I press the save button often because I do not trust the computer to do the task itself. Another reason for being forced to think about the intentionality of the other entity may be that the other entity poses some threat. Digital technology can certainly act as a threat. For instance, my adviser does not use Automated Teller Machines (ATMs) when depositing money because he does not trust the machine to safely get his money to the correct place. The ATM is a threat in that it could fail and lose my adviser's money. A teacher may need at some point to decide whether to use an online submission program. The online submission program is threatening in that 1) the teacher may not know how to use it and 2) the program could lose papers or alter them in such a way that the teacher might look incompetent. The teacher is forced to decide whether to trust that digital technology. The teacher must also take into account other factors and decide whether to use that digital technology. Overall, the teacher benefits by considering the losses and gains involved in using a particular digital technology for a particular task. In the next section I review definitions of trust and suggest a way to conceptualize trust in technology.

Conceptualizing Trust in Technology

If we never trusted anyone, we could never learn anything useful from anyone else; people might not be telling us the truth. Additionally, it would be difficult to cooperate with other people in joint ventures; the other people may fail to honor their side of the deal. If we could not trust others, then a great deal of time would need to be devoted to suspecting all others of lying, of being incompetent, of not being dependable, or of simply being mean and out to harm everyone else. Putnam (1993) suggests that trust is important in understanding the origins of civic engagement.
Further, Fukuyama (1995) argues that one of the most important outcomes of trusting beliefs and behaviors is spontaneous sociability. Spontaneous sociability refers to “the myriad forms of cooperative, altruistic, and extra-role behavior in which members of a social community engage, that enhance collective well-being and further the attainment of collective goals” (Kramer, 1999, p. 583). In general, trusting others creates a more efficient society by allowing humans to focus on being productive rather than always being suspicious. On an individual rather than an organizational level, one can invoke Maslow's hierarchy of needs to understand how trust might influence cognition. In Maslow's (1943) hierarchy of needs, before a person's cognitive and aesthetic needs can be met, safety and feelings of belonging must be met. To feel safe and to feel a sense of belonging, one must be able to trust those around them. Through this perspective, trust is a prerequisite to learning, exploring, discovering, creating, and ultimately self-actualization. Overall, trust allows for a more efficient and more self-actualized society. Clearly trust is important in society, but what is trust and how can we study the trust relationship between teachers and technology? Pajares (1992) notes that beliefs are a “messy construct” because of the poor conceptualizations of the beliefs and belief structures being studied (p. 307). Yet he also notes that, “...when [beliefs] are clearly conceptualized, when their key assumptions are examined, when precise meanings are consistently understood and when specific belief constructs are properly assessed, they can be the single most important construct in educational research” (p. 329). As a belief, trust is a messy construct, and in order to take advantage of its power as a predictive construct, it must be examined and conceptualized in a clear way.
Trust can be used with reference to people (mother, friend, teacher), to objects (ATM, computer, alarm clock), and even to abstract concepts (science, religion). How can we conceptualize trust in such a way that it captures all the ways in which the word trust can be used? According to the Oxford English Dictionary, the word “trust” appears to come from the Old Norse word traust, which means confidence. But do we have the same type of confidence in people as we do in objects? Does someone have the same kind of confidence in a teacher as in an alarm clock? Tschannen-Moran and Hoy (2000) would argue that people do not have the same kind of trust in people and objects because the alarm clock is not a social actor. For example, one may have confidence in the alarm clock, but when the alarm clock breaks, one does not necessarily feel that trust has been betrayed by the alarm clock. In this view, reliance without the possibility of betrayal is not the same kind of trust as reliance with the possibility of betrayal (Baier, 1986). Thus, it seems important that trust be conceptualized as a more complex set of ideas that allows us to capture, compare, and contrast the kind of trust between people and the kind of trust that occurs for technology. Hardin (2002) views trust as a three-part relation involving properties of a truster, attributes of a trustee, and the specific context or domain over which trust is bestowed. For Hardin (2002), trustworthiness encapsulates the attributes of the trustee. While a model is different from a definition, models are built upon definitions of phenomena in hopes of understanding and predicting the particular phenomenon. Therefore, in the following discussion, the terms definition and model are used interchangeably. Within each section below, I briefly describe current approaches to understanding trust and how each model attempts to take into account properties of the truster, the trustee, and the context.
Moral Models

Philosophers tend to conceptualize trust as being ethically and morally justifiable behavior. For example, Hosmer (1995) suggests that trust is “the expectations of one person, group, or firm of ethically justifiable behavior—that is, morally correct decisions and actions based upon ethical principles of analysis—on the part of the other person, group, or firm in a joint endeavor or economic exchange” (p. 399). Moral models focus on the morality of the trustee.

Rational Choice Models

Unlike moral models, which focus on aspects of the trustee, rational choice models focus specifically on properties of the truster. One element that social scientists tend to agree upon is that trust is a psychological state (Kramer, 1999). When conceptualized as a psychological state, trust must be defined with the help of cognitive processes. Economic decision-making research has been influential in understanding the decision-making processes that humans use. According to a utility-based view of decision making, humans are rational creatures that calculate the costs and benefits of a choice before behaving. Based on this idea, one can view trust as a rational decision. In other words, a rational choice perspective of trust views decisions about trust as being similar to other forms of risky choice; individuals are presumed to be motivated to make rational, efficient choices. From an economic and sociological standpoint, trust can be considered a rational calculation of costs and benefits. In this view trust is a voluntary action, and if the person in whom trust is placed (the trustee) is trustworthy, then the truster will be better off than if he or she had not trusted the trustee. Conversely, if the trustee is not trustworthy, then the truster will be worse off than if he or she had not trusted (Coleman, 1990). Trust in rational choice models can be inferred through behavior in multiple motive games.
In a rational choice model, trust is the behavior whereas trustworthiness is the characteristic of the trustee. A multiple motive game is based on a phenomenon called the prisoner's dilemma (PD). In the traditional prisoner's dilemma situation, two suspects are arrested by the police for a crime. The police do not have enough evidence to convict either suspect. Therefore, the police separate the suspects into two different rooms and tell each suspect that if he testifies for the prosecution against the other and the other remains silent, the betrayer goes free and the non-compliant suspect receives the full prison sentence. If both stay silent, both prisoners receive a small sentence for a minor charge. If each betrays the other, each receives a moderate sentence, longer than the sentence for the minor charge but shorter than the full sentence. Each prisoner must make the choice of whether to betray the other or to remain silent. However, neither prisoner knows for sure what choice the other prisoner will make. So this dilemma poses the question: How should the prisoners act? In utility-based decision making, the only concern of each individual party is maximizing his or her own payoff, without any concern for the other's payoff. In this situation, no matter what the other suspect does, one suspect will always gain a greater payoff by betraying the other suspect. Since betrayal is more beneficial than cooperation, any rational person who calculates the possibility of gains and losses will choose to betray the other suspect in hopes of getting greater gains. This calculation is problematic because if both parties choose to betray, the result is mutual loss. The optimal outcome is for both suspects to remain silent and only be charged with the minor charge. Based on this type of situation, Deutsch (1958) used financial pay-offs as a type of gain in a multiple motive game situation. In Deutsch's experiment, two participants were asked to play a game. If both players cooperated, they each received $9.
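The dominance argument just described can be checked with a short sketch. The payoff values below are illustrative assumptions (the text does not fix exact sentences); what matters is only their ordering:

```python
# Years in prison (lower is better) for (my_choice, other_choice);
# each entry is (my_years, other_years). Values are illustrative.
PAYOFF = {
    ("silent", "silent"): (1, 1),    # both charged with the minor offense
    ("silent", "betray"): (10, 0),   # I serve the full sentence; betrayer goes free
    ("betray", "silent"): (0, 10),
    ("betray", "betray"): (5, 5),    # both serve a moderate sentence
}

def best_response(other_choice):
    # A purely self-interested suspect minimizes his or her own years.
    return min(("silent", "betray"), key=lambda me: PAYOFF[(me, other_choice)][0])

# Betrayal dominates: it is the best response no matter what the other does,
# yet mutual betrayal (5, 5) is worse for both than mutual silence (1, 1).
print(best_response("silent"), best_response("betray"))  # betray betray
```

Because betrayal is the best response to either choice, the calculating suspect betrays, and two calculating suspects end up at the mutually worse outcome.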
If one player cooperated and the other player chose to betray, the betrayer received $10 and the cooperative player lost $10. Players were said to trust each other if they made a cooperative choice because they had made themselves vulnerable to losing money. In this view, trust is an outcome and a behavior rather than a process. One either trusts the other person or does not trust the other person. Rational choice models have been criticized because they seem to overestimate a human's cognitive capacities. For instance, it seems that people rarely engage in the level of calculation that would be necessary to make accurate decisions about whether to trust something (March, 1994). Purely rational choice models of trust often do not take into account the unstable emotions of humans. Finally, neither moral models nor rational choice models do an adequate job of taking into account the social nature of trust.

Social System Models

At the other end of the spectrum from rational choice models are definitions that capture the social nature of trust. However, similar to moral models of trust, such definitions do not capture the truster's willingness to take a risk. Social definitions view trust as a more general attitude about the social systems in which individuals are embedded. For example, Cummings and Bromiley (1996) write that trust “is an individual's belief or a common belief among a group of individuals that another individual or group a) makes good-faith efforts to behave in accordance with any commitments both explicit or implicit, b) is honest in whatever negotiations preceded such commitments, and c) does not take excessive advantage of another even when the opportunity is available” (p. 4). In a similar vein, Fukuyama (1995) suggests that “trust is the expectation that arises within a community of regular, honest, and cooperative behavior, based on commonly shared norms, on the part of other members of the community” (p. 26).
Fukuyama also suggests that trust is an expectation based on social roles and habits, which is reminiscent of moral models of trust. It is not that you are willing to take a risk, but rather that it is a habit to behave in a particular way. We do not think about the decision to trust that another person will stop at a red light; we just expect the stopping behavior out of habit. Putnam's (2000) conceptualization of bonding social capital is closely related to others' conceptualizations of trust. Bonding social capital is an exclusive form of social capital. It occurs when strongly tied individuals, such as family and close friends, provide emotional or substantive support for one another. People with bonding social capital are not generally diverse in their backgrounds, in contrast to people with bridging social capital. Because bonding social capital creates a close group, there is generally out-group antagonism. Formation of any group often leads to feelings of mistrust of people outside the group (Sherif, 1988). While Putnam (2000) does not label the phenomenon as trust, it does seem that bonding social capital is a conceptualization of trust, both as a network and as an outcome.

Relational Models

Combining rational choice models and social system models, relational models of trust view trust as both a calculation and a social orientation toward other people and toward society in general (Kramer, 1999). The social orientation of the truster toward the trustee can be considered the role or identity-related needs of a person. Further, the motives behind trusting actions can be social rather than just resource-based, as is generally the case in rational choice models.
With reference to schools in general, Bryk and Schneider (2002) write that relational trust “views the social exchanges of schooling as organized around a distinct set of role relationships: teacher with students, teachers with other teachers, teacher with parents and with the school principal” (p. 20). Each person has a particular role with particular obligations that must be met to maintain trusting relationships. For instance, a teacher is expected to be competent and a parent is expected to care about his or her child. These are social expectations, not economic expectations. When expectations are not met, trust is withdrawn, leading to weakened relationships. Reminiscent of moral models of trust, relational models of trust involve expectations. However, these expectations are socially driven, similar to social system models of trust. In addition, relational trust models suggest that trust involves personal reflection upon decision making, which is reminiscent of rational models of trust. Tschannen-Moran and Hoy (2000) conceptualize trust in a multifaceted manner similar to Bryk and Schneider (2002), which seems useful for thinking about trust and technology. Tschannen-Moran and Hoy (2000) review many definitions of trust and ultimately suggest that trust is one party's willingness to be vulnerable to another party based on the confidence that the other party is a) benevolent, b) reliable, c) competent, d) honest, and e) open. Relating this model back to Hardin's (2002) model of trust, benevolence, reliability, competence, honesty, and openness are attributes of the trustee, or the trustworthiness of the trustee. Vulnerability is an aspect of the truster. Indeed, it is the failure to take into account the vulnerability of the truster, or teacher, that is problematic when thinking about extending a social system model to teacher trust in technology.
According to Tschannen-Moran (2004), benevolence is the confidence that one's well-being or something one cares about will be protected by the trustee. Reliability is made up of a combination of predictability and caring. She argues that while we may expect a person to act in a consistently malicious way, we do not actually trust this person. The trustee could be thought of as predictable, but not reliable, and therefore not trustworthy. Competence is the ability of the trustee to perform a task as expected. Honesty is related to the integrity of the trustee. Openness deals with how open a trustee is to new information. Rather than focusing entirely on the truster's behavior, as is the case with rational models, relational models use scales and surveys to measure the truster's beliefs about properties of the trustee. Tschannen-Moran and Hoy (2000) review many different surveys and scales used to measure trust. In their review, they divide the instruments into measures of behavior, general trust, trust in intimate relationships, and organizational trust. Betrayal or remaining silent in the prisoner's dilemma discussed above is an example of a behavioral measure of trust. Surveys of general trust look at how much people trust a variety of social actors, including media, parents, and people in general. For example, Gardner and Benjamin (2004) asked 45 individuals from the greater Philadelphia metropolitan area to comment on the state of ‘trustees’ in America. ‘Trustees’ were defined as well-known individuals whose opinions are trusted and therefore looked to for advice and guidance on relevant issues. The study found a general decline in the number of people that were trusted. Interestingly, they found that while individuals tend to blame the media for the loss of trust, they also mentioned media personalities (e.g., Oprah Winfrey, Tom Brokaw, Rush Limbaugh) as the individuals that they trusted most of all.
Surveys about trust in intimate relationships ask participants to make judgments about the trustworthiness of a specific intimate partner. Rempel, Holmes, and Zanna (1985) asked participants how much they agreed with statements like “I can count on my partner to be concerned about my welfare” and “Though times may change and the future is uncertain, I know my partner will always be ready and willing to offer me strength and support.” While such surveys were not designed or used specifically for teachers and students, they may be useful for developing an understanding of trust in specific teacher-student or teacher-technology relationships. Finally, organizational trust surveys are usually about how much people trust others in the workplace. For example, Cummings and Bromiley (1996) have a twelve-item survey that deals with trust in the workplace. Example items include “In our opinion, ___ is reliable” and “We feel that ___ negotiates with us honestly.” Similar to this type of survey, Hoy and Tschannen-Moran (2003) developed what they call the Omnibus T-Scale. The survey is a measure of faculty trust, which has three subtests: faculty trust in colleagues, in the principal, and in clients (parents and students). In the 26-item scale, teachers were asked the extent to which each statement characterizes their school along a 6-point scale from strongly disagree to strongly agree. Example items included: “Teachers in this school generally look out for each other,” and “The principal in this school typically acts in the best interest of the teachers.” Factor analysis found that the test had distinct components, which the authors characterized as benevolence, reliability, competence, honesty, openness, and vulnerability (Hoy & Tschannen-Moran, 2003). Alpha coefficients of reliability for each subscale were consistently above 0.90.
Discussion of Models

Looking back at the different models, we can see that, when placed into Tschannen-Moran and Hoy's (2000) framework, moral models include aspects of benevolence and honesty, but do not include any sense of willing vulnerability, reliability, competence, or openness. Rational models touch on a person's willingness to be vulnerable, and in a multiple motive game, beliefs about the trustee could be manipulated to assess how different levels of openness, benevolence, reliability, honesty, and competence influence the final decision of whether to trust or not. In general, social system models of trust do not take into account the willingness for vulnerability that is a key part of trust in technology. Tschannen-Moran and Hoy suggest that by using a model of trust that includes different facets within the concept, it becomes possible to look at different kinds of trust in different kinds of situations. For example, when we trust that people will obey the rules of the road, e.g., stopping at red lights and yielding at yellow lights, this could be considered a moral choice. The morality stems from understanding that we must all give up a piece of our selfish desires (to simply run the red light) for the benefit of everyone. This motive could be driven by benevolence for society as a whole or could simply be a habit, as Fukuyama might argue. On the other hand, we also use an ATM to deposit money. In this situation, the trustee is a machine. Because of the protean nature of technology, it becomes essential to have a multifaceted definition of trust. Building on Tschannen-Moran and Hoy's (2000) framework, I suggest that trust in digital technology may be conceptualized as a person's willingness to be vulnerable to technology based on the confidence that the technology is a) benevolent, b) reliable, c) competent, d) honest, and e) open. In the next section I describe in more detail how each facet might play out in the classroom.
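As a minimal sketch, the proposed conceptualization can be written as a simple data structure. The names, the binary facets, and the all-facets rule below are my own illustrative assumptions; the chapter does not claim that trust requires confidence in every facet or that the facets are all-or-nothing:

```python
from dataclasses import dataclass

@dataclass
class TechnologyTrustee:
    """Beliefs a truster holds about a digital technology (the trustee)."""
    benevolent: bool
    reliable: bool
    competent: bool
    honest: bool
    open: bool

def willing_to_be_vulnerable(beliefs: TechnologyTrustee, risk_acceptable: bool) -> bool:
    """Trust as willingness to be vulnerable, conditioned on confidence in
    all five facets plus an acceptable risk level (a deliberate simplification)."""
    return risk_acceptable and all(
        [beliefs.benevolent, beliefs.reliable, beliefs.competent,
         beliefs.honest, beliefs.open]
    )

# A teacher confident in every facet, in a situation whose risk she accepts:
print(willing_to_be_vulnerable(
    TechnologyTrustee(True, True, True, True, True), risk_acceptable=True))  # True
```

The sketch makes the structure of the definition explicit: the facets describe the trustee, while vulnerability and risk belong to the truster and the situation.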
Extending a Relational Model of Trust to Educational Technology

Assuming that we do indeed treat digital technologies in a similar way to people when the technology poses a potential threat, trust in digital technology may involve vulnerability, reliability, competence, benevolence, honesty, and openness. Since situational factors might influence a teacher's willingness to be vulnerable, the risk entailed in the situation is included in the section about vulnerability. Below I discuss how the different facets can be used to understand teacher trust in digital technology.

Vulnerability and Risk

Imagine that you have just hired a new employee. The new employee is young and does not know the skills of the trade, so as a good employer you decide to work very hard to develop a rapport with your employee because you want everyone to have the best possible experience. You take your employee out to lunch and help that person learn all the new skills necessary. You spend time and energy on the employee because you know that this employee will share the workload in the long run. As Luhmann (1979, pp. 5, 8, 15) says, “Where there is trust, there are increased possibilities of experience and action.” You come to trust that new hire to work efficiently and trust that the employee will not betray you by not doing the work. And then that trusted employee quits and leaves you with twice as much work as you would have had if you had not spent all that time training the new employee. You have made yourself vulnerable by relying on the employee to share the load of responsibility, and the person did not work out. You learn from this experience that you need to be very careful when deciding whom to hire in the future. Teachers are taking a similar risk when they decide to use technology in their classrooms. They must depend on the technology to augment their instruction. For example, I often use short video clips or movies as examples in psychology classes.
I depend on my computer to play the DVD and the projector or TV to show the images correctly. I rely on the digital technology to make sure my students get the best possible experience. However, sometimes something goes wrong in the process of getting the images to show. My hard drive might crash. The projector bulb might go out. I might not press the right buttons and waste numerous minutes tinkering with the technology, never getting it to work. This might be okay if we are covering a relatively simple topic or we have time to catch up and watch the video later in the semester. But let's say it is the day before an exam, and the video material is represented on the exam. All the tests are already printed out. Should I take the chance and trust the technology to work, or should I avoid it? The risk level of the situation might affect my decision. Related to the specificity of the situation, Ertmer (2005) suggested that teachers' visions for, or beliefs about, classroom technology use did not always match their classroom practices because of situational constraints. For example, teachers' explanations for the inconsistencies often included references to contextual constraints such as curricular requirements or social pressure exerted by parents, peers, or administrators. Indeed, Marcinkiewicz and Regstad (1996) found that the only significant factor in predicting teachers' computer use was the expectation of computer use from influential people in the field, such as principals, colleagues, students, and professionals. Therefore, it seems that when some risk was involved, like the pressure of an upcoming standardized test or a principal telling the teacher to use the digital technology, it changed what the teacher actually did even if the teacher's belief about what should be done remained constant.
Therefore, one factor that is important in understanding how much a preservice teacher trusts a technology, and whether she would want to use that digital technology in the future, is how much risk the situation entails. These are issues teachers have to face more and more as different digital technologies are encouraged in schools.

Competence

Vulnerability is a facet of the truster whereas competence is a belief that the truster has about the trustee. According to Tschannen-Moran (2000), competence is “the ability to perform the task as expected, according to appropriate standards” (p. 30). This facet is similar to McCrory's (2006) boundary, disciplinary, and authority affordances as well as Hew and Brush's (2007) subject culture category. A teacher must decide if the material is correct for the subject being taught. If a student feels that her teacher does not know what he or she is talking about, the student will probably not pay attention or learn anything from the teacher. Likewise, a teacher must feel that material found on digital technology is accurate and will be presented in an appropriate way. The substitute teacher in Norwich who was mentioned in the introduction found that the computer was not competent when it displayed pornography. As noted by the blogger, the teacher had been “framed by the computer.”

Reliability

Tschannen-Moran (2000) suggests that reliability is “the sense that one is able to depend on another consistently” in a caring manner (p. 28). For example, an employer could think that her employee is very reliable because the employee arrives on time to all meetings and does the same amount of work every day. Reliability, as conceived by Tschannen-Moran and Hoy (2002), involves the perception that the trustee could betray the truster in some way. The employee could betray the employer by not showing up to work without any explanation.
Reliability could be considered similar to McCrory's (2006) stability affordance, in which a teacher must decide how stable the technology is over time. An unreliable colleague may be similar to a computer program crashing without explanation. In the event her computer crashes, the teacher (or learner) may actually feel that the computer has betrayed her in some way, even though the computer is not actually capable of betrayal, as was suggested earlier (Baier, 1986). There is a sense that the trustee (computer or computer designer) was not reliable when the computer or program stopped working. Even when the computer is simply acting as a conduit, in a threatening situation a person could view the computer as being unreliable and thus untrustworthy. For example, if a computer crashed in the middle of sending an e-mail or if an important e-mail was sent to a spam box instead of an inbox, a person might feel that the computer is not reliable. Perhaps one incident of such a failure or betrayal leads that person to distrust not just that computer, but all computers. I suggest that reliability with reference to digital technology is the belief that hardware or software will perform as expected in repeated trials.

Benevolence

Benevolence is “the confidence that one's well-being or something one cares about will be protected and not harmed by the trusted party” (cf. Tschannen-Moran, 2000, p. 19). For example, perhaps you are an employer and have an employee who is competent in that she does a good job. The employee is also reliable in that she shows up regularly. However, you find out that this employee often talks behind your back and gossips about other employees. You have made an emotional investment by befriending the employee, and the employee has betrayed you by not reciprocating that kindness (McAllister, 1995). If you spend all your time worrying about saying the wrong thing in front of this employee, you will ultimately be less productive.
Benevolence is a more difficult construct to extend to understanding trust in digital technology. It is strange to think of a computer or PDA as having compassion. Nonetheless, the designers behind web sites or software programs could be seen as benevolent. For example, the substitute teacher in Norwich probably feels that the designers of pop-up sites are not very compassionate. In turn, she may feel that the computer itself is not benevolent, and thus she does not trust it. In fact, a strong CASA paradigm would make this argument.

Honesty

Honesty is the facet of trust that deals with whether another individual can be relied upon to tell the truth, whether verbal or written (Rotter, 1967). Without confidence that what another person says is accurate, you run into a problem similar to that of a lack of competence. The difference is that if a person is not competent, that person can still be benevolent. A dishonest person may be competent, but might not be benevolent enough to tell the truth. If a student reports being sick, but is actually going out of town for a concert, the teacher may feel that the student is not honest. McCrory's (2006) affordance of pedagogical context, in which a teacher must make decisions about the level of management, monitoring, and feedback, could be viewed as technological honesty. Pedagogical context could also be viewed as competence if the feedback was incorrect rather than just dishonest. “Phishing” is an example of digital technology not being honest with a truster. Phishing is an attempt to fraudulently acquire sensitive information, such as passwords and credit card details, by masquerading as a trustworthy entity in an electronic communication. You may get an e-mail that looks like it is from your bank, saying you need to re-enter your information for security purposes. In reality, it is not from your bank. Once a victim of phishing, the truster may no longer trust any of the bank's e-mails.
In a classroom setting, a teacher may purchase an instructional video game that claims to teach students many different mathematical skills. However, when the teacher lets the students play the game, she finds out that the game has lots of inappropriate content or does not do what the designers claim it does. The teacher loses trust in the designers, and perhaps all instructional games, because of this experience.

Openness

Openness is the process in which people make themselves vulnerable by sharing information or control with the truster. This is not a characteristic of the truster, like vulnerability, but rather a characteristic of the trustee. How much personal information does the digital technology share with the truster? An example of self-disclosure from a form of digital technology would be a particular search engine informing its users that it moves sites up based on their level of sponsorship. A person might not find the site to be benevolent, but might find it to be very open and honest and therefore trustworthy.

Conclusions

Teachers have to make many decisions about whether to use digital technology for instructional purposes. While there are many barriers that could prevent them from using the technology, trust is perhaps the most important belief for predicting technology usage. After decomposing Tschannen-Moran and Hoy's (2000) multifaceted conceptualization of trust for humans, it seems likely that some facets of trust may be more likely to extend to trust for technology than others. In particular, McCrory's (2006) five affordances of technology seem to fit well into the facets of competence, reliability, and honesty. The following section outlines how the literature review led to the five research questions asked in this study.

Purpose of this Study

Technology is a prevalent part of education, and it is essential that we understand why teachers decide to use technology or not for educational purposes.
One reason why teachers do not use the technology available may be that they do not trust the technology. One place where future teachers may learn more about different types of technology use, or potentially learn to distrust educational technology, is in their teacher preparation program. However, teachers may trust or distrust technology for reasons other than what they experience in teacher preparation programs. Therefore, it is important that we understand the different variables that may contribute to distrust of digital technology in preservice teachers, those teachers currently in a teacher preparation program. In this study, I begin to look for systematic differences in why preservice teachers trust or distrust digital technology, in hopes of understanding not only why teachers do not take advantage of the digital technology available to them, but also where this distrust arises. In particular, I ask the following questions in this study:

1. How does trust in digital technology compare with trust in other people?

2. Does preservice teacher trust correlate with what that person says she will do in a situation where she encounters digital technology in the classroom?

3. Given a situation that entails some risk, does the level of risk affect whether a preservice teacher says she will trust a digital technology? Does the level of risk affect whether a preservice teacher says she will use a particular digital technology?

4. How do personal traits (gender, race and ethnicity, age, socioeconomic status, grade level the teacher plans to teach, content area the teacher feels most equipped to teach, and computer self-efficacy); psychological traits (how trusting the person is in general and how risky the person is in general); and experiential factors (experience with technology in classes) of the preservice teacher contribute to that person's decision to trust digital technology?

5. To what extent and in what ways do the explanatory variables for trusting a digital technology also explain what a preservice teacher says she will do in her classroom?

These questions are not organized by importance. Rather, the answer to question two builds on the answer to question one, and so forth.

Discussion

I have discussed the importance of digital technology in education and reasons why teachers might not use it, ultimately suggesting that a lack of trust may be a key reason for lack of implementation. I have also discussed how trust has been conceptualized across many different disciplines. As noted by Tschannen-Moran and Hoy (2000), there seem to be several ways of defining and measuring trust. Trust can be viewed in multiple ways: as a three-part relation involving properties of a truster, attributes of a trustee, and the specific context or domain over which trust is bestowed (Hardin, 2002); as a moral decision (Hosmer, 1995); as a rational choice decision (Coleman, 1990); as a complex psychological state that is dependent on emotional and social influences (Kramer, 1999); as an expectation derived from community roles and habits (Cummings and Bromiley, 1996; Fukuyama, 1995); or as a relational model (Tschannen-Moran and Hoy, 2000; Hoy and Tschannen-Moran, 2003; Bryk and Schneider, 2000). Complicating the problem of conceptualizing trust is that trust can be used with reference to people (mother, friend, teacher), to objects (ATM, computer, alarm clock), and even to abstract concepts (science, religion). Keeping in mind that the kind of trust we have for other humans might be similar to the kind of trust we have for digital technology, because we do, at least to some extent, treat digital technology like a social actor, it is important that trust be conceptualized as a more complex set of ideas that allows us to capture both trust for people and the kind of trust that occurs for technology.
Ultimately, the multifaceted view of trust proposed by Tschannen-Moran and Hoy (2000) is most useful when trying to deconstruct the kind of trust that transpires between preservice teachers and digital technology, which is why I chose to use Tschannen-Moran and Hoy's (2000) relational model of trust in developing a framework for understanding preservice teacher trust in digital technology. In the following chapter, I show how the literature about educational technology and trust was used to create and test an instrument that measures preservice teacher trust in educational technology. Further, I discuss what factors might contribute to variance in trust in technology and how these constructs were measured in a final survey to assess what predicts preservice teacher trust in and use of educational technology.

CHAPTER THREE

Design and Logic of Variables

Thus far, conceptualizations of educational technology and of trust have been reviewed and synthesized. To understand what personal, psychological, and experiential factors contribute to a preservice teacher saying she will trust and/or use a digital technology in situations in which risk varies, I developed a survey that included instruments to capture each of the hypothesized influential factors. In each section below, I describe the origin of each part of the instrument (see Appendix B for the final instrument) as well as the research-based hypotheses about how the different constructs being measured may play out in this study.

Dependent Variable: Instrument to Measure Use of and Trust in Digital Technology

Before assessing what factors are associated with preservice teacher trust in digital technology, an instrument that measured preservice teacher trust in classroom digital technology had to be created. There are scales that look at teacher attitudes and beliefs toward computers and scales that assess trust in individuals (intimate) and organizations (organizational).
For example, Mueller and colleagues (2008) used a survey that assessed the extent to which teachers view computers as an instructional tool or a motivational tool. Participants were asked to what extent they agreed with statements such as, "I see computers as tools that can complement my teaching" (instructional tool) and "I use computers to motivate my students" (motivational tool). However, this type of scale did not get at issues related to trust, such as whether the teacher thought the content was competent or the technology reliable. I also thought about extending items from Hoy and Tschannen-Moran's (2003) Omnibus T-Scale and simply replacing the human component with technological components. In their scale, participants reported, on a scale from 1 to 6, how much they agreed with statements like "Teachers in this school generally look out for each other" and "The principal in this school typically acts in the best interest of the teachers." One could imagine a survey that included items like, "Computers typically act in the best interest of the students." However, in both the attitudes and beliefs scale and the trust scale, Hardin's (2002) third property of trust, related to context, was not apparent. Context seemed very important when trying to understand why a preservice teacher might choose to trust digital technology and whether she would choose to use the digital technology. Thus, rather than creating items like those seen in scales about attitudes and beliefs about technology, interpersonal trust, or organizational trust, I used Tschannen-Moran and Hoy's (2000) five facets of trust as a guide to create vignettes in which a teacher might have to decide whether she would 1) use the digital technology and 2) trust the digital technology.
Once again, Tschannen-Moran and Hoy (2000) suggest that trust is one party's willingness to be vulnerable to another party based on the confidence that the other party is a) benevolent, b) reliable, c) competent, d) honest, and e) open. Benevolence is the confidence that one's well-being or something one cares about will be protected by the trustee. According to Tschannen-Moran (2004), reliability is made up of predictability and caring. She argues that while we may expect a person to act in a consistently malicious way, we do not actually trust this person. The trustee could be thought of as predictable, but not reliable, and therefore not trustworthy. Competence is the ability of the trustee to perform a task as expected. Honesty is related to the integrity of the trustee. Openness deals with how open a trustee is to new information. Based on the extension of these categories to digital technology as described in Chapter 2, vignettes related to situations a teacher will likely encounter regarding classroom digital technology were created. In each situation, the teacher, or in this case preservice teacher, would have to decide what she believed and what she thought she might do when presented with such a situation. The situations were made to be quite specific in hopes that there would be little room for misinterpretation. Related to specificity of the situation, the amount of risk involved in the situation was also considered in the creation of items. For example, a teacher might be more likely to trust a technology when there is little educational risk to the students than in a situation where there is a chance the students might not learn the material if the teacher chooses to trust the technology. Thus, analogous high and low risk items were created. A high risk situation is one in which the outcome matters a great deal for both the students and the teacher. A low risk situation is one in which the consequence of the action will be small.
Below are examples of high and low risk situations involving competence that a preservice teacher will likely face when she teaches. The key difference is italicized below.

High: You know that the internet can provide students with a great deal of information. You would like your students to use the internet to help them research the Civil War. There are some controversial topics involved in discussions about the Civil War. You want your students to understand how to use internet sites and learn about the Civil War. Do you tell the students to use the internet for their research?

Low: You know that the internet can provide students with a great deal of information. You would like your students to use the internet to help them research trees. You cannot think of any controversial topics involved in discussions about trees. You want your students to understand how to use internet sites and learn about trees. Do you tell the students to use the internet for their research?

Pilot Studies

Before implementing the instrument with all the independent variables, I made sure the instrument measured high and low levels of situation risk and that the instrument was reliable. Below I describe pilot studies used to assess inter-rater reliability of risk level, test-retest reliability, and scale reliability.

Situation Risk Assessment

First, I gave the instrument to a group of five educational psychology and teacher education students and discussed the purpose and uses of the instrument. After 24 stories were decided upon, 12 of which were high risk situations (situations entailing a high degree of risk) and 12 of which were analogous low risk situations (situations entailing a low degree of risk), I tested whether the stories were indeed high and low risk situations. The items were then modified to eliminate as much confusion as possible between risk of the situation and trust in the technology.
Therefore, to the extent possible, the technology itself was removed as a variable from the situation. Appendix A includes the questions given in this risk assessment pilot. After gaining proper departmental and IRB approval, an online survey was sent to a randomly selected group of 50 teacher education students at a large Midwestern university. The following procedure was followed to ensure the greatest response rate possible:

1. A prenotice email was sent through Zoomerang to alert all preservice teachers that they would receive a survey in the coming days and to explain why the survey was important.

2. Two days after the prenotice email was delivered, a second contact was made with the preservice teachers. This email reminded them about the prenotice email sent earlier and provided them with their unique link to the survey.

3. Three days after the preservice teachers received their unique survey links, a third contact was made with all who had not opted out of the survey. Participants could opt out at any stage of contact. This e-mail expressed hope that, if they had not completed the survey already, they would do so soon.

4. One week after the thank-you contact, only those preservice teachers who had not yet completed their survey or had not opted out received a follow-up email. This email encouraged them to submit their surveys and provided the link to the survey again in case they had deleted previous emails.

The students were given the following instructions at the beginning of the instrument:

Directions: Imagine you are a teacher in the following situation. Based on the information given in each story, is this a high stakes situation or a low stakes situation? On a scale of 1-5, with 1 being high stakes and 5 being low stakes, what is at stake for you as a teacher and your students as learners? A high stakes situation is one in which the outcome matters a great deal for both the students and the teacher.
A low stakes situation is one in which the consequence of the action will not affect the student or teacher as much as in a high stakes situation. For example, see the two situations below:

You can use either a brand new textbook or a textbook that is ten years old to teach students about Genetics. The new textbooks are a bit more expensive and you would need to ask the district for additional funding. Your school already has the older textbooks and you would not need to ask for any additional funding if you used these. Do you ask for more money to get the new textbooks or go ahead and use the old ones?

You can use either a brand new textbook or a textbook that is ten years old to teach students about the Industrial Revolution. The new textbooks are a bit more expensive and you would need to ask the district for additional funding. Your school already has the older textbooks and you would not need to ask for any additional funding if you used these. Do you ask for more money to get the new textbooks or go ahead and use the old ones?

Now think about what is at stake in each situation. Genetics is a new area in which new findings are constantly being made, whereas the Industrial Revolution took place in the late 18th and early 19th centuries. There is more at stake in the first situation than in the second situation, since there is a lot about Genetics that was not known 10 years ago, whereas most of what is known about the Industrial Revolution has already been written. Therefore, you would give the first situation a 1 or 2 (depending on how high you personally think the stakes are) and you would give the second situation a 4 or 5. Some situations are more difficult than others, so a textbox is provided for you to explain your answer. However, no text is necessary for you to continue ranking the level of stakes for each situation.

Twelve of the 50 students responded to the request, giving a response rate of 24%.
This level of participation may be due to a range of reasons. First, it was the middle of the semester and many students may have been too busy with classes to do the survey. Further, there was no incentive for them to participate. At the end of the survey, I asked students how seriously they took the survey on a 1 to 4 scale, with 1 being very seriously, 2 somewhat seriously, 3 not very seriously, and 4 not seriously at all. One student answered that he or she did not take it very seriously, and that student's results could not be used. The other 11 participants' data were used in data analysis. Paired samples t-tests were computed for each pair of high and low risk situations to establish construct validity. Any pairs that were not significantly different at the .01 level were pulled out (6 pairs). Based on the comments from participants, five of the six pairs of items were rewritten in an attempt to make them more discriminating. These new items were sent to the same group of 11 students to see if the ratings would change based on the new wording of the items. Seven students responded, but one presumably did not read the directions and gave the same rating for all of the items. This person's responses were eliminated and analysis was done with the responses from the remaining six students. Four of the 12 pairs of items were not significantly different even with changes and were eliminated, leaving eight pairs of items (16 total items). All eight pairs were both practically and significantly different at the .05 level, suggesting that, in general, people agreed that these high risk items were indeed riskier than their counterpart low risk items.

Test-retest Reliability and Scale Reliability

When thinking about reliability of the instrument, there were two key considerations: how well the items hung together as a coherent instrument and whether people would respond to the items consistently over time.
Therefore, both test-retest reliability and scale reliability were assessed after inter-rater reliability was established for the items. To assess test-retest reliability, the online survey was sent to a randomly selected group of 50 teacher education students at Michigan State University. These students were told that they would receive a $5 Amazon gift card after completing both the present survey and the one that would be sent to them in two weeks. Seven of the 50 students completed both parts. To get more data, I asked education graduate students to complete the two surveys with the same directions. Six more participants completed the survey, for a total of 13 participants. All 13 participants reported taking the survey seriously. Because I was interested in both the belief of trust and the reported usage, the two questions below were modified for each specific situation and asked after each vignette (see Appendix B for the final survey):

1. On a scale of 1-5 (with 1 being not likely and 5 being very likely), how likely are you to use the technology?

2. On a scale of 1-5 (with 1 being I don't trust it at all and 5 being I completely trust it), how much do you trust the technology?

Two pairs of items were deleted to increase the scale reliability at Time 1. I did not use Time 2 reliabilities because there could be some undesired practice effects that influenced the scale. A paired samples t-test found no difference in scores from Time 1 to Time 2 for any of the pairs, suggesting that participants were stable in their beliefs over the two-week period for all of the final 12 items (6 pairs). Ultimately, four items were deleted, leaving 12 items: 6 high risk and 6 analogous low risk items. The final scale reliabilities were .70 for the Trust questions and .75 for the Reported Usage questions.
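The Time 1 to Time 2 stability check described above can be sketched as a paired-samples t-test, where failing to find a significant change is the desired result. This is an illustrative sketch with fabricated ratings, not the study's data; the function name is mine.

```python
from scipy import stats

def stable_over_time(time1, time2, alpha=0.05):
    """Paired-samples t-test on the same respondents' scores at two
    administrations; returns True when no significant change is found
    (i.e., beliefs look stable over the interval)."""
    t, p = stats.ttest_rel(time1, time2)
    return bool(p >= alpha)

# Hypothetical trust ratings from 13 respondents two weeks apart
time1 = [4, 3, 5, 2, 4, 3, 4, 5, 3, 4, 2, 3, 4]
time2 = [4, 3, 4, 2, 5, 3, 4, 5, 3, 4, 3, 3, 4]
print(stable_over_time(time1, time2))  # True: no Time 1 to Time 2 difference
```

Note that "no significant difference" is weak evidence of stability on its own with n = 13, which is why the test-retest correlations reported next matter as well.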
Test-retest reliability was high for both the Trust (r(12) = .85, p < .05) and Reported Usage items (r(12) = .79, p < .05). The final set of 12 items, which exemplify the facets of reliability, competence, and honesty, can be seen in the final instrument in Appendix B. It is interesting to note that the items developed to investigate benevolence and openness did not fit statistically with the scale developed and were the four deleted. Question one in this study was how trust in digital technology compared with trust in humans. Perhaps we tend to associate benevolence and openness with humans, but not with digital technology. This will be discussed further in Chapter 6.

Predictive Variables

At this point, I move to a discussion of how the potential predictive variables were measured. These include personal traits (gender, race and ethnicity, age, socioeconomic status, what level the teacher plans to teach, what content area the preservice teacher feels most equipped to teach, and computer self-efficacy); psychological traits (how trusting the person is in general and how risky the person is in general); and experiential factors (experience with technology in classes).

Demographic Factors and Individual Skills

In this section, gender, race, age, socioeconomic status, what level the teacher plans to teach, and what content area they feel most equipped to teach are all addressed. Simple demographic data were collected in this part of the survey (see Appendix B). With regard to gender, it has been reported that men are more confident working with and learning from computers than women (Garland and Noyes, 2008). Therefore, it seems likely that men will say they will use digital technologies more than women, even if they do not trust digital technologies as much as women do.

Computer Self-Efficacy

General computer self-efficacy refers to "an individual's judgment of efficacy across multiple computer application domains" (Marakas, Yi, and Johnson, 1998).
To measure computer self-efficacy (CSE), I used Khorrami-Arani's (2001) modified version of Murphy et al.'s (1989) computer self-efficacy scale to test whether people with greater CSE are more likely to say they will use digital technologies. Murphy's (1989) 32-item Likert-type scale was developed to measure individuals' perceptions of their capabilities regarding specific computer-related knowledge and skills. Items represented beginning, moderate, and advanced computer skills. For example, a beginning skill would have an item like, "I feel confident working on a personal computer." An example of an advanced computer skill item is, "I feel confident explaining why a computer program does not work on a computer." Khorrami-Arani (2001) reported an internal consistency reliability of .95. It seems likely that the more confident a person is with computers, the more likely he or she will report trusting and using the technology.

General Trust

This study used the Social Values Orientation (SVO) Scale used by Yamagishi (1986) as a predictor of one's predisposition to trust others. The SVO scale is considered a good predictor of one's predisposition to trust strangers and has been used by previous researchers to look at the correlation between trust and the propensity to contribute to public goods (Yamagishi, 1986; Yamagishi and Sato, 1986), as well as between trust and the propensity to cooperate in a prisoner's dilemma game (Parks et al., 1995, 1996). The SVO scale is short and consists of five questions that are each answered by choosing one of five options: strongly disagree, disagree, neutral, agree, and strongly agree. An answer of strongly disagree gets a 5 while strongly agree gets a 1, except for question 4, which is reverse scored. The lowest possible score is 5 (least trust) and the highest possible is 25 (most trust).
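The scoring rule just described can be sketched as follows. This is my own illustration of the rule as stated in the text (strongly disagree scores 5, strongly agree scores 1, item 4 reverse scored, totals ranging from 5 to 25); the function name and the response coding of 1 = strongly disagree through 5 = strongly agree are assumptions for the example.

```python
def score_svo(responses):
    """Score the five-item SVO scale as described in the text.
    responses: five answers coded 1 = strongly disagree ... 5 = strongly agree.
    Returns a total from 5 (least trusting) to 25 (most trusting)."""
    if len(responses) != 5:
        raise ValueError("SVO scale has exactly five items")
    total = 0
    for item_number, r in enumerate(responses, start=1):
        # Strongly disagree (1) scores 5 and strongly agree (5) scores 1,
        # except item 4, which is reverse scored.
        total += r if item_number == 4 else 6 - r
    return total

print(score_svo([5, 5, 5, 1, 5]))  # 5: minimum possible score
print(score_svo([1, 1, 1, 5, 1]))  # 25: maximum possible score
```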
An example of an item on this scale is, "Most people are basically honest." If trust in other people is identical to trust in technology, it seems likely that a measure of how trusting a person is in general will predict his or her level of trust in technology.

Risky Behavior

When entering into a social relationship, a person must initially take the risk of being betrayed before trust can develop. In a sense, risk is essential for trusting relationships. This same concept can be applied to teacher trust in digital technology. For example, if a teacher never tries to use an online submission program, he or she will never need to trust it. That person must risk that it will not work to be able to take advantage of the educational benefits that relationship might have to offer. Therefore, it seems logical that people who are riskier in general will be more likely to trust digital technology across many situations. To measure personal risk, I used the Attitudes Towards Risk Questionnaire (RISK; Appendix B). Developed by Franken, Gibson, and Rowland (1992), the short version consists of 10 items. In this study, participants answered each item by indicating how much a statement describes them. Half of these questions (numbers 2, 3, 8, 9, and 10) are psychological risk items, such as, "I do not let the fact that something is considered immoral stop me from doing it." The other half of the questions (numbers 1, 4, 5, 6, and 7) are physical risk items, such as, "I like the feeling that comes with taking physical risks." Responses were given on a 5-point scale ranging from 1 (not like me) to 5 (like me). If a person likes to take risks, that person may be more likely than someone who does not, to report that they will use the technology in high risk situations.

Experience with Digital Technology

This part of the study included questions about students' experiences with technology in classes taken at the university level. Hill et al.
(1987) found a significant positive correlation between previous computer experience and computer self-efficacy beliefs in a sample of 133 female undergraduates. They also found that experience influenced behavioral intentions to use computers through self-efficacy beliefs. Also, as mentioned in Chapter 2, Kellenberger (1997) found that preservice teachers' ratings of the value of computer use were associated with how likely they said they would be to use computers in future classrooms. Therefore, I suggest that more positive experiences with digital technology in their own classes will lead preservice teachers to say they will be more likely to trust and use technology in their own future classes.

Conclusions

Thus far, using the conceptualizations of classroom digital technology and trust, an instrument to measure preservice teacher trust in classroom digital technology was developed and piloted. This instrument was integrated into a larger survey with previously designed and tested instruments to begin to answer how personal, psychological, and experiential factors contribute to preservice teachers' reasoning for reported usage of digital technology in their future teaching. The next chapter will discuss how preservice teachers were selected and how the data were collected.

CHAPTER FOUR

METHODS

Participants

After receiving approval from the institutional review board (IRB), Teacher Education Department, and Registrar's Office, a random sample of 699 undergraduate students in the Teacher Education program at a large Midwestern university were sent an e-mail asking them to participate in the study. At this school, to be admitted to the Teacher Education program a student must have 56 credits by the beginning of the semester in which they are admitted (typically as juniors) and have an overall grade point average of 2.75 or higher. Of the 699 students asked to participate, 136 completed the survey, for a response rate of 19%.
This response rate may be due to several factors. First, the survey could not be sent until approval was gained near the middle of the semester, when most students are busiest. Second, the instrument took between 30 and 60 minutes to complete. This is a substantial amount of time for a busy preservice teacher. Finally, the survey was only given online. It may be that students who do not use computers as much were less likely to complete the survey. Ideally, responders would have been compared to nonresponders, but this was not feasible within the constraints of the existing system. Nonetheless, steps were taken to discover how representative the population of this study was of the population of all U.S. preservice teachers. First, I compared the demographic information from this study's results to that found nationally. Second, I compared the final 25 responders with the first 111 responders to see if those responders who needed additional coaxing (and are thus more closely associated with nonresponders) were systematically different from those who did not need such coaxing. Ultimately, the comparison suggests that even though the number of participants is relatively small, the sample is at least somewhat representative of the American population of preservice teachers.

Gender and Level Preservice Teachers Plan to Teach

More future elementary school teachers (75) took this survey than future secondary teachers (60). Further, there were more females (112) than males (23). Two students did not respond and did not select a gender. This distribution of gender is consistent with prior research suggesting that there are indeed more females than males in teacher education programs. For example, the NEA (2003) reported that 79% of the teachers it surveyed in 2000 and 2001 were female (cf. Zumwalt and Craig, 2005).
Further, in a meta-analysis of 44 studies of teacher education studies of teacher education students, Brookhart and Freeman (1992) found 75% to 80% were female (cf. Zumwalt and Craig, 2005). Further, this number was as large as 85% amongst future elementary school teachers (AACTE, 1999). Table 1 shows what level a student reported planning to teach grouped by gender. Table 1. Breakdown of Gender and Level the Student Plans to Teach Level Preservice Teacher Plans to Teach Elementary Secondary Total Male 2 @%) 21 (35%) 23 (17%) Gender Female 73 (97%) 39 (65%) 112 (83%) Total 75 (56%) 60 (44%) 135 *2 students did not answer the above questions and are not included 53 Area of Interest The majority of men that took this survey were interested in teaching science and mathematics (64%), whereas only 39% of women were interested in teaching science and mathematics. This is consistent with the suggestion that women are underrepresented in mathematics and science fields (e.g., Xie and Shaumann, 2003). Table 2 shows the breakdown of interest area by gender. For later analysis, Applied arts, English, and Fine Arts were coded as 0 (Arts). Life Science, Mathematics, Physical Science, and Social Science were coded as 1 (Math/Sciences). Table 2. Breakdown 0 Interest Area by Gender Female Male Applied Arts 2 (2%) 0 English 45 (40%) 6 (26%) Fine Arts 11 (10%) 0 Life Science 11 (10%) 0 Mathematics 20 (18%) 5 (22% Physical Science 5 (4%) l (4%) Social Science 19 (17%) 11 (48% Age The majority of students (90%) that took this survey and answered the question about age were between 20 and 22 years of age. Results found 56 (41%) of the students were Juniors and 75 (55%) of the students were Seniors. The other 4% (seen in Table 3) were 19 or over 30—years-old. 54 Table 3. 
Breakdown of Preservice Teachers by Age

            Frequency (percentage)
19 years    2 (2%)
20 years    50 (37%)
21 years    57 (42%)
22 years    15 (11%)
23 years    6 (4%)
24 years    1 (1%)
25 years    1 (1%)
26 years    1 (1%)
Above 30    3 (2%)

Race

Table 4 shows that 91% of the preservice teachers that took this survey were White. Only 9% of the students that took this survey were a minority. In the studies reviewed by Brookhart and Freeman (1992), the percentage of Whites ranged from 80% to 96%. This suggests that the results of this study are quite similar to those found nationally. For later analysis, White students were given a code of 1 and all other races and ethnicities (minority) were given a code of 0. These dummy codes were necessary for later regression analysis.

Table 4. Breakdown of Student Race

            Frequency (Percent)
White       124 (91%)
Minority    12 (9%)
Total       136

Socioeconomic Status

As noted earlier, a common proxy for socioeconomic status is mother's education level. In this study, both mother and father education levels were collected. Table 5 shows a great deal of variance in mother education level, suggesting a range of socioeconomic status among students.

Table 5. Breakdown of Preservice Teachers by Mother and Father Education Level

                    Mother      Father
High School         38 (28%)    40 (29%)
Bachelors           60 (44%)    44 (32%)
Masters             29 (21%)    34 (25%)
MBA                 2 (2%)      6 (4%)
PhD                 4 (3%)      8 (6%)
I don't know        3 (2%)      4 (3%)
Total               136         136

Like the historical finding (NEA, 2003), males were less likely than females to have a mother with at least some college education (see Table 6).

Table 6. Breakdown of Preservice Teachers' Mother Education Level by Gender

                        Male        Female
High school             4 (21%)     34 (47%)
At least some college   19 (79%)    72 (53%)

Early versus Late Responders

A problem often encountered in survey implementation is that the participants that choose to respond are systematically different from those who choose not to respond.
Since this is a possibility, students that responded early, after only one to four reminders (111 preservice teachers), were compared to those who needed five reminders (25 preservice teachers) before responding. It seems likely that without the additional contact, the late responders would not have completed the survey; they are therefore at least somewhat comparable to those that did not complete the survey. Independent samples t-tests were performed to compare the late and early responders. No differences, either practically or statistically significant, were found between the early and late responders on any of the individual items (both the usage and trust questions) in the final instrument. Further, there were no differences found between the early and late responders in their general trust, computer self-efficacy, mother education level, age, valence of experience with digital technology in their general university classes, teacher education classes, and classes in Erickson Hall (the education building on campus), or whether they plan to use computers when they teach in the future. Interestingly, the early responders (20.50) scored lower than the late responders (24.40) on the Attitude Toward Risk Questionnaire (t(134) = -2.736, p < .05), suggesting that early responders are less likely to take risks than late responders.

Representativeness of the Sample Population

Overall, demographic findings from this study were proportionally similar to those found nationally. This consistency suggests that despite possible differences between those students that took this survey and those that did not, this sample is representative with respect to general demographics. Early and late responders were similar except that the early responders scored lower than the late responders on the Risk Questionnaire.
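The early-versus-late comparison above relied on independent samples t-tests. A minimal sketch of the pooled-variance form of that test follows; the scores are made up for illustration and are not the study's data.

```python
import math
from statistics import mean, variance

def independent_t(a, b):
    """Student's pooled-variance independent samples t statistic.

    Returns (t, degrees of freedom) for two independent groups,
    assuming equal variances."""
    na, nb = len(a), len(b)
    pooled = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    t = (mean(a) - mean(b)) / math.sqrt(pooled * (1 / na + 1 / nb))
    return t, na + nb - 2

# Toy risk scores for early and late responders (illustrative only):
early = [20, 22, 19, 21, 20]
late = [25, 24, 26, 23, 24]
t, df = independent_t(early, late)
print(round(t, 2), df)  # a negative t: the early group scores lower
```

In practice the study's tests were run in SPSS; the sketch only shows the statistic being computed.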
Instruments

The instrument described in Chapter 3 and seen in Appendix B included questions about 1) demographic information, 2) technology usage and competency, 3) general trust beliefs and riskiness, 4) experience in classes at their university and in their program, and 5) trust in classroom digital technology. This instrument was administered to students through Zoomerang, an online survey program. The complete survey took between 30 and 60 minutes to complete.

Study Design

This is a mixed methods exploratory study that uses 1) a within-subjects design to test whether the manipulated risk level affects reported responses, 2) qualitative comments to help explain preservice teachers' reasoning in more detail, and 3) regression and path analyses to discern which independent variables influence preservice teachers' reported decision to trust and use educational technology.

Procedure

First, the dependent instrument designed to measure preservice teacher trust in digital technology was piloted and tested (see Chapter 3 for more details). The final scale reliabilities were .70 for the Trust questions and .75 for the Reported Usage questions. Test-retest reliability, measured by Pearson's correlation coefficient, was high for both the Trust items (r(12) = .85, p < .05) and the Reported Usage items (r(12) = .79, p < .05). The complete instrument (found in Appendix B) was administered using the following procedure:

1. A prenotice email (Appendix D) was sent through Zoomerang to alert all preservice teachers that they would receive a survey in the coming days and to explain why the survey was important. Students were also informed that they would receive $2 in Spartan cash and that their names would be entered into a raffle to win a $50 Amazon gift card as an incentive. Only students who completed the survey were eligible.

2. Two days after the prenotice email was delivered, a second contact was made to the preservice teachers.
This email reminded them about the prenotice email sent earlier and provided them with their unique link to the survey (Appendix E).

3. Three days after the preservice teachers received their unique survey links, the third contact was made to all who had not opted out of the survey. Participants could opt out at any stage of contact. This email expressed hope that, if they had not completed the survey already, they would do so soon (Appendix F).

4. One week after the third contact, only those preservice teachers who had not yet completed their survey or had not opted out received a follow-up email. This email encouraged them to submit their surveys and provided the link to the survey again in case they had deleted previous emails. This step was repeated three more times for a total of seven contacts. See Appendix G for the last request sent to the students.

If a student chose to participate, they were directed to a consent form in the online survey. By pressing submit, they agreed to participate in the survey. All responses were recorded in Zoomerang and then downloaded into Excel and SPSS for further analysis.

CHAPTER FIVE

Results

At this point, a survey of literature discussing classroom digital technology and conceptualizations of trust has been elaborated. It has been argued that using a multifaceted view of trust allows for the comparison of trust in technology with trust in humans. A survey was developed and implemented to address questions about preservice teacher trust and reported usage of educational technology. This chapter discusses the findings by addressing each of five research questions.

1. How does trust in digital technology compare with trust in other people?

2. Does preservice teacher trust correlate with what that person says he or she will do in a situation where he or she encounters digital technology in the classroom?

3.
Given a situation that entails some risk, does the level of risk affect whether preservice teachers say they will trust a digital technology? Does the level of risk affect whether preservice teachers say they will use a particular digital technology?

4. How do demographics and individual skills (gender, race and ethnicity, age, socioeconomic status, what level the teacher plans to teach, what content area the preservice teacher feels most equipped to teach, and computer self-efficacy), psychological traits (how trusting the person is in general and how risky the person is in general), and experiential factors (experience with technology in classes) of the preservice teacher contribute to that person's decision to trust the technology?

5. To what extent and in what ways do the explanatory variables for trusting a digital technology also explain what preservice teachers say they will do in their classroom?

Trusting People Versus Trusting Technology

This project explored how preservice teachers understand educational technology and whether that understanding influences whether they say they will use technology in their own future classes. In this section, I review how the scale reliabilities and the relationship between general trust scores and trust in technology scores suggest a potential difference between trusting people and trusting technology. When trying to build a reliable instrument to measure teacher trust in technology, items were created to reflect the five facets of trust. However, when computing scale reliabilities, only items that were based on the facets of reliability, competence, and honesty met the statistical criteria needed to be part of the scale. Responses to items built based on openness and benevolence did not fit in the same scale as the other items. Scale reliabilities alone do not indicate that trust in technology is different from trust in people in that only trust in people involves elements of openness and benevolence.
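The statistical criterion behind the scale reliabilities discussed here is internal consistency, conventionally estimated with Cronbach's alpha. A minimal sketch of that computation, using toy responses rather than the study's items, is:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    `items` is a list of per-item response lists, one inner list per item,
    all from the same participants in the same order:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]
    item_var = sum(pvariance(i) for i in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Three toy items whose responses move together, so alpha comes out high:
items = [[1, 2, 3, 4, 5], [2, 3, 4, 5, 6], [1, 3, 3, 4, 5]]
print(round(cronbach_alpha(items), 2))
```

Items whose responses do not covary with the rest of the scale (as reported here for the openness and benevolence items) pull alpha down and are typically dropped.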
It could be that if I created different items based on different types of technology, one could find that items related to openness and benevolence would fit into the scale. However, looking back at the literature review in Chapter 2, both McCrory (2006) and Hew and Brush (2007) include only elements of reliability, competence, and honesty when discussing teachers' relationships with technology. Thus, both the scale reliabilities and the literature review suggest that trusting technology involves reliability, competence, and honesty, but not necessarily openness and benevolence. Third, neither the preservice teachers' trust of people in general, as measured by the Social Values Orientation (SVO) Scale (Yamagishi, 1986; Yamagishi and Sato, 1986), nor the preservice teachers' desire to take risks, as measured by the Attitudes Towards Risk Questionnaire (Franken, Gibson, & Rowland, 1992), predicted trust in technology scores as measured by the scale developed in this study. Preservice teachers' scores on the SVO ranged from 7 to 24, with a mean score of 15.14 (SD = 3.047). This is higher than results found by Chaudhuri, Khan, Lakshmiratan, and Shah (2006), who found a mean of 12.9 (SD = 3.29) for 26 female college students. However, based on the 133 people that completed all the items, the scale only had an internal consistency reliability of .593, which is lower than that found by Yamagishi and colleagues. In this study, the RISK scale had an internal consistency of .85. Scores ranged from 10 to 42, with an average of 21.21 (SD = 6.6). For the psychological risk questions, scores ranged from 5 to 20, with a mean of 8.51 (SD = 3.507). For the physical risk questions, scores ranged from 5 to 24, with a mean of 12.70 (SD = 4.436). Table 7 shows the summary results for trust and risk measures.

Table 7. Descriptive statistics for trust and risk measures
                    N      Minimum    Maximum    Mean     Std. Deviation
SVO Score           136    7          24         15.02    3.13
Total RISK Score    136    10         42         21.21    6.60
Psychological RISK  136    5          20         8.51     3.51
Physical RISK       136    5          24         12.70    4.44

As part of the dependent variable, preservice teachers were asked on a scale of 1-5 how much they trusted the technology in each question. Scores ranged from 1 (I don't trust it at all) to 5 (I completely trust it). A composite Trust score was created by taking the average of the scores on the Trust questions -- whether the preservice teacher said they trust the technology or not. Preservice teachers' composite Trust scores ranged from 2 to 4.25, with a mean of 3.33 (SD = 0.44). Importantly, neither SVO (general trust) scores nor RISK scores predicted trust in technology scores. This further suggests that there is something different between trusting people or having a desire to take personal and physical risks, on the one hand, and trusting technology on the other. These three factors combined -- the scale reliabilities, suggestions from the literature, and the lack of a relationship between instrument scores -- suggest that there is something different about trusting people and trusting technology. In particular, the facets of competence, reliability, and honesty seem to be part of both trusting people and trusting technology, whereas the facets of openness and benevolence do not seem to extend as well from trusting people to trusting technology. This difference and its implications are further discussed in Chapter 6.

Teacher Trust and Reported Usage of Technology

The second question posed in this study was whether preservice teacher trust correlated with what that person says they will do in a situation where they encounter digital technology in the classroom. To answer this question, scores on the trust and reported usage scales were computed, analyzed, and compared. As noted earlier, this instrument, consisting of 12 items with 6 analogous pairs of high and low risk situations, was piloted and found to be reliable.
To reiterate, students were asked 1) On a scale of 1-5, how likely are you to have your students use the particular digital technology in that situation? and 2) On a scale of 1-5, how much do you trust the particular digital technology? The questions were tailored to fit the specific situation being addressed. Scores ranged from 1, meaning "not likely at all (to use it)" in the first part and "I don't trust it at all" in the second part, to 5, meaning "very likely (to use it)" in the first part and "I completely trust it" in the second part. A composite Reported Usage score was created by taking the average of the scores on the Usage questions -- how likely the preservice teacher is to use the technology. A composite Trust score was created by taking the average of the scores on the Trust questions -- whether the preservice teacher said they trust the technology or not. Students varied in how likely they would be to use the digital technology presented (Reported Usage), ranging from an average score of 2 to 4.67 with a mean of 3.51 (SD = 0.52). Students ranged from 2 to 4.25 with a mean of 3.33 (SD = 0.44) on the Trust items. Answering Question Two posed above, there was a significant positive correlation between Reported Usage and Trust, r(136) = .67, p < .05, indicating that as trust increases, reported usage also increases. One could argue that the high correlation is an artifact of the question about trusting the technology following the question about using the technology. However, there were differences in the values that students reported for each type of question. To compare an individual's average Reported Usage score and Trust score, the absolute value of the difference between the two was computed; the absolute value was used because sometimes the difference was negative.
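The correlation and absolute-difference checks described above can be sketched in a few lines; the composite scores below are invented for illustration and are not the study's data.

```python
from statistics import mean, stdev

def pearson_r(x, y):
    """Sample Pearson correlation coefficient between two score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

# Toy composite scores (illustrative only):
usage = [3.0, 3.5, 4.0, 2.5, 4.5]   # Reported Usage composites
trust = [2.8, 3.2, 3.9, 2.6, 4.1]   # Trust composites

r = pearson_r(usage, trust)
# Absolute differences, since the signed difference can be negative:
abs_diffs = [abs(u - t) for u, t in zip(usage, trust)]
print(round(r, 2), round(mean(abs_diffs), 2))
```

A high r alongside nonzero absolute differences mirrors the chapter's point: the two composites track each other without being identical.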
Differences between the two scales ranged from 0 (2 participants) to 1.17 points, with an average difference of 0.34 (SD = 0.28) points. This suggests that even though the Trust scores were correlated with the Reported Usage scores, on average, participants were not simply answering the same thing for each item.

Risk Level Analysis

The third question posed in this study was: Given a situation that entails some risk, does the level of risk affect whether a preservice teacher says they will trust a digital technology? Does the level of risk affect whether a preservice teacher says they will use a particular digital technology? Before this question was answered, descriptive statistics for high and low risk items were analyzed. Appendix H shows the descriptive statistics for each item response to questions that asked, on a scale of 1-5, with 1 being "I don't trust it at all" and 5 being "I completely trust it," how much do you trust the particular technology? Means for trust in high risk items ranged from 2.1 to 3.89. Means for trust in low risk items ranged from 2.08 to 3.98. Appendix I includes descriptive statistics for each response to the Reported Usage items. As noted before, these are responses to the question: On a scale of 1-5, with 1 being "not likely at all" and 5 being "very likely," how likely are you to use that particular digital technology? Overall, means for high risk items ranged from 2.43 to 4.44. Means for low risk items ranged from 2.57 to 4.62. The overlap in scores suggests that something more than risk level was a factor in whether the preservice teacher said she would use the technology or not. Descriptive statistics for each item suggest that some digital technologies were more likely to be used than others, irrespective of the risk level involved.
For example, the means for using the internet for research ranged from 4.44 to 4.62 and the means for using an online submission program ranged from 3.97 to 4.04, whereas using a new web service that tracks information had means of only 2.43 to 2.57. Different digital technologies evoked different responses about whether the preservice teacher said he or she would use them for future teaching. Descriptive statistics also suggested that there were times when participants trusted the digital technology but said they would not use it. For instance, participants were likely to trust the DVD in class, with means of 3.89 and 3.98 for high and low risk situations respectively, even though they were not as likely to say they would use it, with means of only 3.32 and 3.33 for high and low risk situations, respectively. On the other hand, when asked about trusting the internet for research, participants were generally ambivalent about whether they actually trusted the internet (means of 3.51 and 3.81), whereas when asked whether they would use it or not, they were very likely to use it (with means of 4.44 to 4.62). In other words, even though the preservice teachers trusted the DVD set-up, they were not necessarily going to use it; even though the preservice teachers did not necessarily always trust the internet, they reported that they would use it for research purposes anyway. Returning to Question Three posed in this study, about whether risk influences a preservice teacher's reported trust and usage of digital technology, a mixed models approach was used to see if there was 1) a main effect of risk on both the Trust and Reported Usage items, and 2) a significant interaction between High and Low Risk Trust and Reported Usage items. There was a main effect for trust, F(1,135) = -.51, p = .02. Thus, if we ignore all other variables, preservice teachers reported that they trusted classroom digital technology more in low risk situations (20.24) than in high risk situations (19.72). The scores reported here are total scores rather than averages. However, there was no main effect for reported usage in the repeated measures test,
The scores reported here are total scores rather than averages. However, there was no main effect for reported usage in the repeated measures test, 66 F (1,135) = -.22, p=.32. Results of the test suggest that if we ignore all other variables, preservice teachers reported future usage is no different in low risk (21.2) and high risk situations (20.98). The interaction is not significant p=.127. Although the interaction was not significant, the pattern is consistent with the hypothesis that there is a smaller difference between trust and reported usage in the low risk situations than in the high risk situations, as seen in Figure 1. It appears that at least some participants who reported not trusting the digital technology in high risk situations still said they would use the digital technology. 21.5 a» .. ._ . 21 .... .W. 3.1...WWW W o \ ,3 \ —I- - Reported E-t ’ Usage 19.5 19 18.5 . Low Risk High Risk Risk Level Figure 1. Trust and Reported Usage mediated by level of risk in the situation Qualitative Data Analysis To explore why a teacher might say they would use a digital technology, but not trust it, or say they would not use it even though they trusted it, qualitative comments 67 were further explored. After each vignette there was room for participants to make comments about their choices. No comments were required, but many participants wanted to justify their decision. The qualitative data illuminate potential reasons for the difference between the affect of risk level on reported trust versus reported usage. I used an Iterative Refinement Cycle (Lesh & Lehrer, 2000) model to analyze the comments. The first interpretive cycle was used to identify those issues that pertained to general pedagogy and classroom culture. The second cycle was used to identify common reasons for trusting the technology but not using it. The third cycle was used to identify common reasons for not trusting the technology, but still using it. 
One factor that arose was that sometimes it is good to teach students about things that the teacher may not trust. For example, when asked about using the internet to help students research the Civil War, one participant wrote, "I may not trust that all the sites are accurate, but I think it important to educate the students about this risk and teach them to be conscientious in their internet research." Another participant wrote:

This becomes another teachable moment on what perceptions exist about the war and/or how the internet is not always the most accurate source for information. Leading to the conclusion, that explains why some colleges have difficulty allowing it as a credible source.

The second participant noted, when using the internet for studying minerals, that "the only other issue [as those mentioned above] here is plagiarism." Nonetheless, using the internet can be made into a "teachable moment" whether the subject is the Civil War or minerals -- even if the preservice teacher trusts the internet more for looking up information about minerals. A second factor that arose involved limitations of both humans and computers. For example, when asked whether they would have their students turn in a big paper before spring break, one participant wrote, "Although I would like to save the paper needed to print all those essays, reading 30-some essays on a computer would be very hard on my eyes." In this case, even though the participant trusted the e-mail, the participant did not want to read the papers online because it would be harder on the eyes than reading them on paper. The same participant wrote of turning in homework assignments using e-mail, "Since I will be teaching Chemistry, most of my homework assignments will involve math and therefore typing them would not be the best option." In this case there is a limitation of the computer software available for this type of subject.
Therefore, even though the preservice teacher trusted e-mail, the teacher would not use it because of human and computer software limitations. A third factor was related to environmental concerns. For example, when asked whether they would have their students turn in a big paper before spring break, one participant wrote, "I'm really big on helping the environment in any way I can, and submitting papers electronically is a great way to help save paper." Irrespective of the risk involved, even if the preservice teacher did not trust e-mail to get the paper there safely, he or she would still use it to avoid the paper usage. Finally, even if preservice teachers say they trust the digital technology more in a low risk situation, they may have curricular reasons for not using it in both the high and low risk situations. For example, when asked whether the preservice teacher would use a DVD in a lesson right before an exam, one participant wrote:

I would want the students to learn from what my lesson plans, the movie might teach unnecessary information or leave out parts I may think are important. I wouldn't be worried about whether the dvd would work but I would opt for something more interactive.

This person was more concerned about the content of the DVD than the reliability of the DVD. Another student wrote, "I'd rather lecture because it's easier to pause a live lecture than a DVD if there are any questions," and then referred to this answer when asked about using the DVD in a low risk situation. A third participant wrote, "DVD's cant monitor and gauge students understanding!" Even if this preservice teacher trusts the DVD in a low risk situation more than in a high risk situation, this preservice teacher will not use a DVD for instruction because the DVD cannot respond to the students. For a list of all the comments made by the participants, please see Appendix J.

What Factors Contribute to Teacher Trust in Technology?
The fourth question posed in this study was: How do demographic factors and individual skills (gender, race and ethnicity, age, socioeconomic status, what level the teacher plans to teach, what content area the preservice teacher feels most equipped to teach, and computer self-efficacy); psychological traits (how trusting the person is in general and how risky the person is in general); and experiential factors (experience with technology in classes) of the preservice teacher contribute to that person's decision to trust the technology? To answer this question, each part of the instrument was assessed. In addition to the general demographic findings discussed in Chapter 4 and the General Trust and Riskiness questions discussed in the previous sections, questions about computer self-efficacy and technology use in college classes were asked. Computer self-efficacy (CSE) involves how confident a person is in his or her computer skills. In a study of CSE using the scale used in this study, Khorrami-Arani (2001) reported an internal consistency reliability of .95. In this study, the reported scores for the 28 items were averaged to create a composite CSE score. As in the original study by Khorrami-Arani, reliability in this study was also .95. For the 135 preservice teachers that completed this part of the survey, scores ranged from 1.71 to 5.18, with a mean of 4.36 (SD = 0.63). This is slightly higher than Khorrami-Arani's average of 4.1 for secondary students in Australia (SD = 0.64). Overall, though, the results are quite similar. An independent samples t-test showed that males (4.72) scored significantly higher than females (4.29) on the CSE self-report (t(133) = 3.015, p = .003). This suggests that males were more confident in their computer skills than their female counterparts. These results support those found by Shashaani (1997), which showed that female students were less confident with computers than male students.
One place where students might learn to use computers is in an educational setting like the university. When asked, "In how many of your education classes here at MSU were computers used for purposes of instruction (including the use of ANGEL)?", most students responded that at least 3 of their education classes used computers (91%). As seen in Figure 2, more than 40% reported that more than 6 of their education classes thus far have used computers.

[Figure 2. Preservice teacher reported use of computers in education classes: bar chart of the number of education classes that used computers; the largest category, "more than 6," had 55 responses]

When asked on a scale of 1-5, with 1 being "consistently reliable," 3 being "ok," and 5 being "the technology was always breaking," how they would rate the general digital technology use in classes taken in the Education department at MSU, 89% felt the reliability ranged from "ok" to "fairly reliable." Only one person that answered the question felt that the technology was not reliable at all in the education classes. In addition to the number of classes that used computers and how reliable the technology was in general, students were also asked, on a scale with 1 being "very positive" and 5 being "very negative," how positive their experience has been with the use of digital technology in three different settings: 1) all classes at the university, 2) courses taken in the Teacher Education Department at MSU, and 3) classes taken in Erickson Hall. Figure 3 shows that preservice teachers' beliefs about their experiences with digital technology vary considerably. Most preservice teachers have had a somewhat positive experience across all three settings. However, one student reported having very negative experiences with technology in Teacher Education classes. Figure 3.
Preservice teacher experience with digital technology at MSU (bar chart comparing responses for all MSU classes, Teacher Education classes, and Erickson Hall classes)

Preservice teachers were also asked about their future use of computers. For example, preservice teachers were asked how often they planned to use computers in the classes that they will eventually teach. Results varied a great deal. Figure 4 shows that the majority of students say they will use computers either daily or once a month. However, it is interesting to note that four people said they NEVER planned to use computers.

[Figure 4. How often preservice teachers say they will use computers in their future instruction: Never, 4; Once a year, 3; Once a semester, 11; most responses were Once a month or Daily]

An exploratory regression analysis was used to determine which variables contributed to the variance in reported trust. Because of the exploratory nature of the study, stepwise selection of variables was used to select the final variables in the regression equation. The entry of independent variables into the regression equation was based first on the zero-order correlation coefficient. The first predictor entered had the highest zero-order correlation coefficient (also called Pearson's r or bivariate correlation). The next predictor was the one that produced the greatest incremental increase in R-squared (squared semipartial correlation), after taking into account the predictor already in the equation. Zero-order and part correlation coefficients are reported in Table 8. Tests were done at each stage that an independent variable was entered into the equation as if it were the last to be entered. Thus, predictors that were considered useful initially could be deleted if no longer considered useful. All variables that were initially considered in the model were theoretically derived (described in Chapter 3, the Logic and Design of Variables). The b coefficients and the constant are used to create the regression equation. The beta coefficients are the standardized regression coefficients.
Their relative absolute magnitudes for a given step reflect their relative importance in predicting the respondent's trust in digital technology. Adding or subtracting variables in the equation affects the size of the betas, so they can only be compared within a model, not between models. The collinearity statistics in this part of the study suggest that the independent variables are not highly intercorrelated. The tolerance for a variable is 1 - R-squared for the regression of that variable on all the other independent variables, ignoring the dependent variable. When tolerance is close to 0, there is high multicollinearity of that variable with the other independents, and the b and beta coefficients will be unstable. VIF is the variance inflation factor, which is simply the reciprocal of tolerance. Therefore, when VIF is high there is high multicollinearity and instability of the b and beta coefficients. All of the tolerance and VIF statistics are within an acceptable range.

Table 8. Zero-order and part correlation coefficients with the trust dependent variable

Variable                                    B       Std. Error  Beta    t       Sig.   Zero-order  Part    Tolerance  VIF
(Constant)                                  3.798   0.817               4.647   0
Gender                                      -0.042  0.115       -0.037  -0.369  0.713  -0.028      -0.032  0.737      1.357
Age                                         -0.041  0.031       -0.161  -1.338  0.184  -0.052      -0.115  0.514      1.946
Education                                   0.083   0.112       0.081   0.74    0.461  -0.044      0.064   0.621      1.61
Race (0=minority, 1=white)                  -0.227  0.143       -0.147  -1.587  0.115  -0.127      -0.137  0.864      1.158
Mother Education                            -0.041  0.033       -0.121  -1.243  0.217  -0.089      -0.107  0.781      1.28
Teacher Level                               0.06    0.092       0.07    0.65    0.517  -0.037      0.056   0.635      1.574
Interest (0=Arts, 1=Math/Sciences)          -0.048  0.08        -0.054  -0.597  0.552  -0.042      -0.052  0.898      1.113
Year in College                             -0.07   0.086       -0.09   -0.817  0.416  -0.107      -0.07   0.619      1.616
General Trust Score                         -0.015  0.013       -0.106  -1.145  0.255  -0.091      -0.099  0.863      1.159
Riskiness Score                             -0.007  0.007       -0.098  -0.978  0.33   -0.017      -0.084  0.744      1.345
Computer Self-Efficacy                      0.006   0.002       0.257   2.652   0.009  0.213       0.229   0.789      1.267
Number of Classes Taken in College
  that Used Digital Technology              0.066   0.03        0.222   2.165   0.033  0.148       0.187   0.705      1.419
Valence of Experience with Digital
  Technology at MSU                         0.136   0.072       0.226   1.883   0.062  -0.046      0.162   0.516      1.939
Valence of Experience with Digital
  Technology in Teacher Education classes   -0.189  0.082       -0.33   -2.305  0.023  -0.226      -0.199  0.364      2.75
Valence of Experience with Digital
  Technology in Erickson Hall               0.048   0.068       0.088   0.708   0.48   -0.124      0.061   0.482      2.073

The t-test tests the significance of each b coefficient. It is important to note, though, that it is possible to have a regression model with a significant F test in which a particular coefficient is not significant (as is the case here). The part correlation "removes" the effect of the control variable(s) on the independent variable alone. Part correlation is used when one hypothesizes that the control variable affects the independent variable but not the dependent variable, and when one wants to assess the unique effect of the independent variable on the dependent variable.
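The stepwise entry rule and the tolerance/VIF relationship described above can both be sketched briefly. The zero-order correlations and the tolerance value below come from Table 8, but the predictor intercorrelation `r12` is an assumed value for illustration only, since the pairwise predictor correlations are not reported here.

```python
def r_squared_two(r_y1, r_y2, r_12):
    """R-squared for a two-predictor model, from each predictor's
    correlation with the outcome (r_y1, r_y2) and their correlation
    with each other (r_12)."""
    return (r_y1**2 + r_y2**2 - 2 * r_y1 * r_y2 * r_12) / (1 - r_12**2)

def vif(tolerance):
    """Variance inflation factor: the reciprocal of tolerance."""
    return 1.0 / tolerance

# Zero-order correlations with Trust, taken from Table 8:
zero_order = {"TE valence": -0.226, "CSE": 0.213, "Classes": 0.148}

# Step 1: the predictor with the largest absolute zero-order correlation.
first = max(zero_order, key=lambda k: abs(zero_order[k]))

# Step 2: of the rest, the predictor adding the most incremental R-squared.
r12 = 0.30  # ASSUMED predictor intercorrelation, illustration only
base = zero_order[first] ** 2
gain = {k: r_squared_two(zero_order[first], r, r12) - base
        for k, r in zero_order.items() if k != first}
second = max(gain, key=gain.get)
print(first, second)

# Tolerance/VIF check against the TE-valence row of Table 8 (tolerance .364):
print(round(vif(0.364), 2))  # 2.75, matching the table's VIF column
```

With these inputs the sketch selects the Teacher Education valence variable first and CSE second, mirroring the two-step model reported below, though the actual selection was done by SPSS on the full variable set.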
Therefore, the Valence of Experience with digital technology in the Teacher Education program at MSU was entered first into the regression equation because of its high zero-order coefficient. Computer self-efficacy (CSE) was entered into the equation in the second block because of its high part coefficient. Table 9 shows the final model for preservice teacher trust in educational technology.

Table 9. Summary of Hierarchical Regression Analysis for Variables Predicting Preservice Teacher Trust in Classroom Digital Technology (N=136)

Variable                                             B      SE B     β
Step 1
  Valence of Experience with Digital Technology
  in Teacher Education Classes                      -0.12   0.05   -0.22
Step 2
  Valence of Experience with Digital Technology
  in Teacher Education Classes                      -0.11   0.05   -0.20
  Computer Self-Efficacy                             0.01   0.00    0.20

Note. R-squared = .05 for Step 1 (p < .05); Change in R-squared = .04 for Step 2 (p < .05).

The model was then tested separately for the high and low risk items (rather than all the items together) because there was a difference in how much participants trusted the technology in high and low risk situations. As can be seen in Table 10, CSE was a significant predictor (at the .05 level) for high risk items.

Table 10. Summary of Hierarchical Regression Analysis for Variables Predicting Preservice Teacher Trust in Classroom Digital Technology in High Risk Situations (N=135)

Variable                                             B      SE B     β
Step 1
  Valence of Experience with Digital Technology
  in Teacher Education Classes                      -0.12   0.05   -0.20
Step 2
  Valence of Experience with Digital Technology
  in Teacher Education Classes                      -0.12   0.05   -0.18
  Computer Self-Efficacy                             0.00   0.00    0.18

Note.
R-squared = .04 for Step 1 (p < .05); Change in R-squared = .03 for Step 2 (p < .05).

CSE was also a significant predictor for low risk items, as seen in Table 11.

Table 11. Summary of Hierarchical Regression Analysis for Variables Predicting Preservice Teacher Trust in Classroom Digital Technology in Low Risk Situations (N=135)

Variable                                             B      SE B     β
Step 1
  Valence of Experience with Digital Technology
  in Teacher Education Classes                      -0.12   0.281  -0.222
Step 2
  Valence of Experience with Digital Technology
  in Teacher Education Classes                      -0.11   0.05   -0.20
  Computer Self-Efficacy                             0.00   0.00    0.21

Note. R-squared = .05 for Step 1 (p < .05); Change in R-squared = .04 for Step 2 (p < .05).

According to this dataset, the variables that seem to influence whether a preservice teacher says they trust digital technology are those that can be influenced: experience with technology in Teacher Education classes and computer self-efficacy. In particular, the more positive a student's experience with digital technology in their TE classes, the more likely they are to say they trust it. The more computer savvy the person believes he or she is, the more likely that person is to say they trust the digital technology. Neither psychological factors such as general trust levels and riskiness nor demographic traits such as gender, age, and SES seemed to play a role in influencing the preservice teachers' decisions about saying they trust the digital technology.

What Factors Contribute to Teacher Reported Usage of Technology?

In addition to trusting the digital technology, Question Five asked 1) To what extent and in what ways do the explanatory variables for trusting a digital technology also explain what a preservice teacher says he or she will do?
And 2) What contributes to a preservice teacher's decision to say they will actually use the digital technology? In the models of trusting digital technology, the two key variables seemed to be valence of experience in teacher education classes and computer self-efficacy. In this section, the same procedure was followed as in building the trust model to see what a model of use looks like. The difference is that in addition to the variables used in the trust model, trust was also used as an independent variable. Table 12 includes the zero-order and part correlations with the Reported Usage dependent variable.

Table 12. Zero-order and part correlation coefficients with the reported usage dependent variable

Model                                      B     Std.    Beta      t    Sig.   Zero-    Part  Toler-    VIF
                                                 Error                         order          ance
(Constant)                              -0.887   0.754          -1.176  0.242
Trust Level                              0.788   0.081   0.697   9.725  0.000   0.661   0.625  0.803  1.245
Gender                                   0.029   0.097   0.023   0.301  0.764  -0.053   0.019  0.736  1.359
Age                                      0.068   0.026   0.235   2.596  0.011   0.109   0.167  0.505  1.979
Education                               -0.048   0.094  -0.042  -0.511  0.610   0.053  -0.033  0.618  1.618
Race (0=minority, 1=white)               0.365   0.122   0.210   2.996  0.003   0.067   0.193  0.844  1.185
Mother Education                         0.063   0.028   0.166   2.270  0.025   0.011   0.146  0.770  1.298
Teacher Level                            0.016   0.078   0.017   0.210  0.834   0.006   0.013  0.633  1.580
Interest (0=Arts, 1=Math/Science)        0.056   0.068   0.056   0.830  0.408  -0.013   0.053  0.895  1.117
Year in College                         -0.009   0.072  -0.010  -0.128  0.899  -0.040  -0.008  0.615  1.626
General Trust Score                      0.011   0.011   0.070   1.006  0.317  -0.065   0.065  0.853  1.173
Riskiness Score                          0.007   0.006   0.089   1.191  0.236   0.050   0.077  0.737  1.357
Computer Self-Efficacy                  -0.001   0.002  -0.038  -0.508  0.613   0.184  -0.033  0.741  1.350
Number of Classes Taken in College
  that Used Digital Technology          -0.020   0.026  -0.058  -0.745  0.458   0.078  -0.048  0.675  1.481
Valence of Experience with Digital
  Technology at MSU                      0.050   0.062   0.074   0.813  0.418  -0.058   0.052  0.499  2.003
Valence of Experience with Digital
  Technology in Teacher Education
  Classes                               -0.069   0.071  -0.106  -0.970  0.334  -0.269  -0.062  0.347  2.885
Valence of Experience with Digital
  Technology in Erickson                -0.108   0.057  -0.176  -1.894  0.061  -0.270  -0.122  0.480  2.083

Because of the high zero-order correlation, the first variable entered into the equation was trust ratings. After trust ratings were entered, five variables seemed to stand out as possible predictors: age, ethnicity, mother education (a proxy for SES), and the valence of experience in TE classes and in Erickson Hall. Ethnicity was not significant. Mother's education was not significant above and beyond trust. Age was significant beyond trust, but only three students were not between the ages of 19 and 26. When those three students were removed from the analysis, the relationship disappeared. Therefore, it seemed likely that the relationship, which suggested that the older the student the more likely they were to say they would use digital technology, was an artifact of those three students and was not reliable. Both the valence of experience in TE classes and the valence of experience in classes in Erickson were significant alone, but the significance varied depending on the order in which they were entered. Teacher Education valence was chosen for inclusion in the model so that further path analyses could be performed on the three key variables: whether the preservice teacher said they would use the digital technology, whether they said they would trust it, and the valence of their experience in their teacher education program. As with trust, none of the psychological variables and only one of the individual factors (gender, age, SES) seemed to explain variance in what participants said they would do with the digital technology.
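The blockwise procedure used throughout this section (enter one block of predictors, record R-squared, enter the next block, record the change in R-squared) can be sketched with ordinary least squares. The data below are synthetic stand-ins for the study's variables, not the actual dataset:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least squares fit of y on X (intercept added)."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1 - resid.var() / y.var()

# Hypothetical data mimicking the trust -> usage structure (n = 136)
rng = np.random.default_rng(1)
n = 136
trust = rng.normal(size=n)
te_valence = 0.3 * trust + rng.normal(size=n)
usage = 0.7 * trust - 0.15 * te_valence + rng.normal(scale=0.5, size=n)

# Step 1: trust only; Step 2: add valence of TE experience
r2_step1 = r_squared(trust.reshape(-1, 1), usage)
r2_step2 = r_squared(np.column_stack([trust, te_valence]), usage)
print(f"Step 1 R^2 = {r2_step1:.2f}, change in R^2 = {r2_step2 - r2_step1:.2f}")
```

Because the models are nested, the Step 2 R-squared can never be lower than Step 1's; what matters is whether the increment is large enough to be significant, as reported in the table notes.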
Table 13 shows the regression model including Trust Level and Valence of experience in TE classes predicting whether preservice teachers said they would use the classroom digital technology.

Table 13. Summary of Hierarchical Regression Analysis for Variables Predicting Preservice Teacher Use of Classroom Digital Technology (N=136)

Variable                                             B      SE B     β
Step 1
  Trust                                              0.79   0.08    0.70
Step 2
  Trust                                              0.76   0.08    0.64
  Valence of Experience with Digital Technology
  in Teacher Education Classes                      -0.09   0.04   -0.14

Note. R-squared = .45 for Step 1 (p < .05); Change in R-squared = .02 for Step 2 (p < .05).

Path Analysis: Trust, Teacher Education, and Reported Usage

It may be that the valence of experience in teacher education classes leads to teacher trust in technology, which then leads to teachers saying they would use technology; or the valence of their experience may independently affect variance in preservice teacher use of technology. A path analysis using AMOS was done to compare the two possible routes to the full model that includes both paths (both direct and indirect routes). A conceptual difference between SEM and regression is that in a regression model the independent variables are themselves correlated (multicollinearity), which influences the size of the coefficients found. In SEM the interactions among these variables are modeled. In the regression models above, the independent variable in one relationship, Trust for example, becomes a dependent variable in other relationships, as when assessing reported usage. Regression cannot handle this very well and would require the use of hierarchical regression. Briefly, the χ² value is a measure of the difference between the actual relationships in the sample and what would be expected if the model were assumed correct. A large difference would indicate that the model does not fit.
Since the χ² distribution is different for different degrees of freedom, the χ² value must be interpreted in terms of its degrees of freedom, by computing the ratio χ²/DF. A model that represented the sample data well would yield a ratio close to 1. The root mean square error of approximation (RMSEA) is related to the difference between the sample data and what would be expected if the model were assumed correct. Because it is a model error term, lower values indicate a better fit. The just-identified model (also called the full or saturated model) is the model in which there is a direct path (not through an intervening variable) from each variable to each other variable. Model A is the full model, with both direct and indirect effects. Deleting one or more of the paths yields an overidentified model. A nonsignificant chi-square would indicate that the fit between the overidentified (simplified) model and the data is not significantly worse than the fit between the just-identified model and the data.

Below are the saturated model (Model A in Figure 5) and the two overidentified models (Models B and C) compared in this paper. Model B, illustrated in Figure 6, hypothesizes an indirect relationship, mediated by trust, from TE Experience to Reported Usage. Model C, illustrated in Figure 7, hypothesizes a direct relationship between TE Experience and Reported Usage.

Table 14 shows the mean, standard deviation, and correlation matrix of the three observed variables. Once again, a lower value on the Valence of Experience with Technology indicates a more positive experience, whereas a higher value indicates a more negative experience. Thus the negative correlation between Reported Usage and TE Experience indicates that more positive experiences are related to more reported usage. The models were estimated with the maximum likelihood method; the resulting parameters are shown in Figure 8.
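The fit statistics just defined can be reproduced from the reported values. The sketch below computes the χ²/DF ratio, the right-tail p-value (the closed form holds only for DF = 1, via the identity that a χ² variate with one degree of freedom is a squared standard normal), and the RMSEA:

```python
import math

def chi2_p_df1(x):
    """Right-tail p-value of a chi-square statistic with 1 DF,
    using chi^2_1 = Z^2, so P(X > x) = erfc(sqrt(x / 2))."""
    return math.erfc(math.sqrt(x / 2.0))

def rmsea(chi2, df, n):
    """Root mean square error of approximation for a sample of size n."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Reported statistics for Model B: chi-square = 4.57, DF = 1, N = 136
chi2, df, n = 4.57, 1, 136
print(f"chi2/DF = {chi2 / df:.2f}")         # well above 1: poor fit
print(f"p = {chi2_p_df1(chi2):.3f}")        # ~.03, matching the text
print(f"RMSEA = {rmsea(chi2, df, n):.2f}")  # ~.16, matching Figure 6
```

Running the same functions on Model C's reported χ² of 6.59 recovers its RMSEA of about .20, so the hand computations agree with the AMOS output quoted in the figures.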
Table 15 shows the covariance matrix for the three variables. The path coefficient between trust and reported usage is statistically significant (β = .64, t = 9.835), and the path coefficient between TE Experience and reported usage is also significant (β = -.14, t = -2.14). The indirect effect of TE experience is -.139 and the direct effect is -.139, for a total causal effect (-.139 + -.139) of -.278. The direct effect of Trust on Reported Usage is .639.

Table 14. Correlation matrix, mean values, and standard deviations of observed variables (N=136)

Variables                                                Mean   SD      1      2      3
1. Reported Future Technology Usage                      3.51   0.52    1
2. Trust in Technology                                   3.33   0.44   .70*    1
3. Valence of Experience with Technology in TE classes   2.08   0.79  -.28*  -.22*    1

* p < 0.05

Table 15. Covariance matrix of observed variables

Variables                                                  1      2      3
1. Reported Future Technology Usage                       .27
2. Trust in Technology                                    .15    .19
3. Valence of Experience with Technology in TE classes   -.12   -.08    .64

[Path diagram: TE Experience → Trust in Educational Technology (-.22); Trust → Reported Usage (.64); TE Experience → Reported Usage (-.14).]

Figure 5. Path analysis results of the saturated model (Model A), with both an indirect relationship between TE Experience and Reported Usage and a direct relationship between TE Experience and Reported Usage. Note: χ² = 0.00, DF = 0. Numbers represent standardized coefficients. All path coefficients were significant at the .05 level.

As noted earlier, it could be that a student's experience with technology in their teacher education classes influences trust, which then influences whether that student says they will use the digital technology in the future. Therefore, Model B hypothesized an indirect relationship between TE experience and reported usage, mediated by trust. The results indicated a χ² value of 4.6 with 1 degree of freedom (p = 0.033). The χ² is significantly different from the full model's, suggesting that this model is not as good as the full model.
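The decomposition above follows the standard path-analysis rule that an indirect effect is the product of the coefficients along its path. With the rounded two-decimal standardized coefficients from Figure 5 it can be checked directly (the totals differ slightly from the text's three-decimal values because of rounding):

```python
# Rounded standardized path coefficients from the saturated model (Figure 5)
te_to_trust = -0.22     # TE Experience -> Trust
trust_to_usage = 0.64   # Trust -> Reported Usage
te_to_usage = -0.14     # TE Experience -> Reported Usage (direct path)

indirect = te_to_trust * trust_to_usage  # effect routed through trust
total = indirect + te_to_usage           # total causal effect of TE Experience
print(f"indirect = {indirect:.3f}, direct = {te_to_usage:.3f}, total = {total:.3f}")
```

Both components round to about -.14, giving a total causal effect near -.28, consistent with the -.278 reported in the text.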
[Path diagram for Model B: TE Experience → Trust in Educational Technology (-.22); Trust → Reported Usage (.67).]

Figure 6. Path analysis results of Model B with an indirect relationship between TE Experience and Reported Usage. Note: χ² = 4.57, DF = 1, p < 0.05, GFI = 0.98, NFI = 0.95, RMSEA = 0.16, e = error. Numbers represent standardized coefficients. All path coefficients were significant at the .05 level.

[Path diagram for Model C: Trust in Educational Technology → Reported Usage (.65); TE Experience → Reported Usage (-.14).]

Figure 7. Path analysis results of Model C with a direct relationship between TE Experience and Reported Usage. Note: χ² = 6.59, DF = 1, p < 0.05, GFI = 0.97, NFI = 0.93, RMSEA = 0.20, e = error. Numbers represent standardized coefficients. All path coefficients were significant at the .05 level.

Model C hypothesized a direct relationship between TE Experience and Reported Usage, without any relationship with Trust. The significant χ² suggests the model does not fit as well as the full model. Model B, which hypothesized an indirect relationship between the valence of preservice teachers' experience with digital technology and reported usage, fit the data better than Model C, but it was still not as good as the saturated model. Overall, the findings suggest that there are both direct and indirect paths between preservice teachers' experience with digital technology and reported future usage. The full model is the best model for the data in this study.

Conclusions

In this chapter several results were discussed. Descriptive analyses of preservice teacher general trust, risk level, computer self-efficacy, experience with technology in classes, trust in classroom technology, and reported future usage of educational technology were described. Based on scale reliabilities and a lack of correlation between general trust scores and trust in educational technology scores, it appears that trusting people is conceptually somewhat different from trusting technology.
Trusting technology did correlate with reported future usage of educational technology. Qualitative data suggested that the reasons for teachers using or not using a particular digital technology were technology specific, but not based on the risk level entailed in the situation. On the other hand, trusting the digital technology seemed to be based both on the specific affordances of the technology and on the risk level of the situation. Regression analyses found that in addition to how much the preservice teachers trust the digital technology, preservice teachers' experience with digital technology in their teacher education classes plays a major role in determining what they say they will do with technology when they become teachers. Finally, a path analysis found that how positive a participant's experience with technology in TE classes was had both direct and indirect effects (through trust) on reported usage. In the following chapter I discuss the implications of these results, limitations of this study, and directions for future study.

CHAPTER SIX

Conclusions and A Look to the Future

In the previous chapter I discussed how there did appear to be a relationship between trusting technology and reported usage of technology, and that several key factors played an important role in this relationship. First, results suggest that the reasons for teachers using or not using a particular digital technology, and for trusting technology, were technology specific. Whereas there was no relationship between the risk level of the situation and reported usage, participants were more likely to say they trusted the technology in situations entailing low risk than in situations entailing high risk.
Regression analyses found that in addition to how much the preservice teachers trust the digital technology, preservice teachers' experience with digital technology in their teacher education classes plays a major role in determining what they say they will do with technology when they become teachers. Finally, based on scale reliabilities, a lack of correlation between general trust scores and trust in educational technology scores, and cohesiveness with previous literature and findings, it appears that trusting people is both conceptually similar to and different from trusting technology. They are similar in that both relationships involve reliability, competence, and honesty. However, trusting humans involves facets of openness and benevolence, neither of which appears to play a role in trusting technology.

Based on the results of this study, there are several key contributions that will be discussed in this chapter. First, I learned many lessons from instrument construction. When developing an instrument, it is important to think through the validity of each item, and the semantics of the instrument items must be considered when interpreting the results of the study. Second, the finding that risk level and pedagogical philosophy affect trust in digital technology but not reported future usage suggests that trust is better viewed as a belief than a behavior, and that there are other factors besides the trust belief that impact a preservice teacher's behavior. Third, more than any other factor tested in this study, experience with technology in teacher education classes influenced both the trust in and the reported future usage of educational technology. This is consistent with previous research, such as work by Handler (1993).
Handler found that, of those teachers who felt prepared to teach with technology, important factors that led to this competence were course work in educational technology, how much computers were integrated in methods classes, and the observation and use of computers during the student teaching experience. It becomes important, then, for teacher educators to understand how they can create a positive experience with educational technology for preservice teachers. Only then can decisions about using technology be based on pedagogy and the situation rather than on the teacher's distrust of the technology. Fourth, this study started with trying to understand how preservice teachers understand technology: as an instructional tool or as more of a social actor. The finding that preservice teachers tend to extend only the facets of reliability, competence, and honesty (not openness and benevolence) when conceptualizing the trustworthiness of technology suggests that, conceptually, trusting humans is somewhat different from trusting technology. This new trust-in-technology framework may give researchers a way to understand teacher technology use in future studies. Finally, I conclude this chapter by examining the limitations of the study and offering directions for future research.

Lessons from Instrument Construction

A key lesson from this study is the importance of construct validity in instrument items. Construct validity is the extent to which items measuring a particular construct actually measure what the theory says they measure. When developing an instrument to measure teacher trust in technology, I had to make numerous decisions about how to measure the construct of preservice teacher trust in digital technology. Initially, I wanted to mimic attitudes-and-beliefs surveys about technology and intimate and organizational trust surveys that had already been developed.
As noted in Chapter 3, there are scales that look at teacher attitudes and beliefs toward computers and scales that assess trust in individuals (intimate) and organizations (organizational). In Mueller and colleagues' (2008) survey about understanding technology, participants were asked to what extent they agreed with statements such as, "I see computers as tools that can complement my teaching" (instructional tool) and "I use computers to motivate my students" (motivational tool). However, this type of scale did not get at issues related to trust, such as whether the teacher thought the content to be competent or the technology to be reliable.

I also considered extending items from Hoy and Tschannen-Moran's (2003) Omnibus T-Scale and simply replacing the human component with technological components. In their organizational trust scale, participants reported how much they agreed with statements like "Teachers in this school generally look out for each other" and "The principal in this school typically acts in the best interest of the teachers." One could imagine a survey that included items like, "Computers typically act in the best interest of the students." However, in both the attitudes-and-beliefs scale and the trust scale, Hardin's (2000) third property of trust, related to context, was not apparent. The classroom is a rich and complex environment in which many external variables, like time pressure or social pressures, can affect a teacher's decision to use a particular pedagogical tool or not. Thus, I decided that the context of the situation, including the subject being taught and the various social pressures that may or may not be present, needed to be made explicit in the items. Without the specific context, participants might be thinking very different things when answering the questions, and the instrument would not be valid.
Rather than creating items like those seen in scales about attitudes and beliefs about technology, interpersonal trust scales, and organizational trust scales, I used Tschannen-Moran and Hoy's (2000) five facets of trust as a guide to create vignettes in which a teacher might have to decide whether they would 1) use the digital technology and 2) trust the digital technology. In addition to deciding whether to use one-sentence items without context, I had to decide what kinds of technologies to include in the items. As noted in the introduction, technology is not monolithic. One could imagine trust in hardware being different from or similar to trust in software. And this kind of trust could be quite different from trust in others in synchronous and asynchronous forms of communication. Further, some facets of trust are more related to some categories of technology than others. For example, reliability might be more important for hardware issues than for Internet communication issues. But once I decided to use vignettes, which are longer and arguably harder to think about than simple statements, I needed either to limit the type of technology and base the items on facets of trust, or to start with several kinds of technology. Since I was interested in understanding whether the situation would influence whether a teacher would trust the technology, and whether trusting that particular technology would predict the preservice teacher saying she would use the technology, I chose to focus on creating the items based on facets of trust rather than types of technology. While I was not able to identify which facets of trust are specifically related to which types of technology, the use of vignettes was beneficial in that the context did matter when thinking about trusting technology.
The items in this study were reliable over time, and results from both the computer self-efficacy survey and an item about future computer use support the idea that vignettes may lead to very different results than those found from surveys that simply use statements. Results of the regression models on trust found that the more self-efficacious and confident participants claimed to be with computers, the more likely they were to say they would trust the digital technology. On the other hand, computer self-efficacy did not seem to play a role in reported usage above and beyond the variance explained by trust and the valence of experience with digital technology in teacher education classes. Further, there was no significant relationship between how often the preservice teachers planned on using computers in their teaching and how likely they were to say they would use the specific digital technology in the specific situations posed in this study. When posed with only one question about computer use, the students may have blindly thought they would use computers a particular amount of time. However, when students were actually presented with a possible situation with social or curricular constraints, they backed off from saying they would use the digital technology, suggesting that surveys about future computer use may be more accurate if specific situations are used rather than general questions about use.

The Importance of Risk Level and Pedagogical Philosophy when Conceptualizing Trust as a Behavior Versus Trust as a Belief

As noted in Chapter 2, trust has been conceptualized as both a belief and a behavior. In this study, risk level influenced the trust belief, but not the reported behavior. On the other hand, participants' pedagogical philosophy tended to influence the reported behavior of using technology or not, but not necessarily the trust belief.
In this section I will explain how this difference suggests the importance of choosing to view trust as a belief in studies of teacher trust in educational technology. If trust had been viewed as a behavior, it would appear that risk level did not influence trust and that pedagogical philosophy did affect the trust behavior, when this is not qualitatively accurate. In this study, I chose to conceptualize trust as a belief because it seemed that other factors besides trust might impact a teacher's decision to use educational technology or not. Indeed, results found that while the trust belief and reported behavior were highly correlated, the trust belief did not always match the reported behavior.

Support for the view that beliefs and behaviors do not always match due to other factors comes from Ertmer, Gopalakrishnan, and Ross's (2001) finding that teachers' beliefs about classroom technology use did not always match their classroom practices. In this study, the preservice teachers' reported action did not always match whether they trusted the technology, particularly in situations that entailed a great deal of risk. Ertmer and colleagues (2001) noted that teachers' explanations for the inconsistencies often included references to contextual constraints such as curricular requirements or social pressure exerted by parents, peers, or administrators. In this study, the contextual constraints were reflected in the risk level of the items. High risk items included situations where there was pressure from administrators or where there was a curricular requirement of some sort, like an upcoming test. In this study, risk level differentially affected the trust belief and the reported behavior. In particular, participants were more likely to say they trusted the technology in situations entailing low risk than high risk, but this risk level influence disappeared when participants were asked whether they would actually use the technology.
The difference between trust and what they said they would actually do was greatest under conditions in which social pressure and contextual constraints were present. In high risk situations, some preservice teachers said that they would use the digital technology even when they did not trust it. Additionally, some participants said they trusted the digital technology, but would not necessarily use it. Often these differences between trust belief and reported behavior in high risk situations arose because of participants' pedagogical philosophies. One could argue that a teacher who continued to use an instrument of technology she did not trust would be acting in bad faith. However, it is important to understand that the specific technology and the teacher's pedagogical goals played a large role in that decision. For instance, when presented with the question about Wikipedia, some participants said that even though they did not trust it, they would use it and teach students about the dangers of using Wikipedia. On the other hand, when asked about the DVD, students often said they would trust it, but would not use it because they were opposed to the DVD doing the instruction. The behavior was not related to trust; rather, the decision was based on a practical philosophy about instruction. The preservice teachers' pedagogical philosophy seemed to be quite important when considering whether to use the technology, but not as important when considering a belief about the technology's trustworthiness. The influence of risk level on belief and the influence of pedagogical philosophy on behavior suggest that teacher trust in digital technology is best viewed as a belief that informs the teacher's decision to use or not use the technology in different contexts.
Creating Positive Experiences with Technology in Teacher Preparation Courses

This study supports the idea that preservice teachers' reported usage of and trust in digital technology are directly related to their experiences with digital technology in their teacher education classes. Qualitative comments further supported this finding. For example, a student in this study wrote, "In my college classes my teachers have had trouble getting their dvds to work. I know better than to completely trust technology. I would try the DVD but have a backup plan just in case it didn't work." In this situation, the student has observed how trusting the DVD to work can lead to problems, which is why the student reports having a plan in case trust in the technology is violated. This is an especially important finding in light of the idea that teachers fall back on instructional experiences of their childhood, not on what occurs in teacher preparation courses. For example, Lortie (1975) notes that preservice teachers arrive in their teacher education courses after having observed their own teachers for thousands of hours. Preservice teachers enter the profession with preconceived notions of instruction. Further, these notions are "intuitive and imitative rather than explicit and analytical" (p. 62). They are "intuitive" in that the preservice teacher's notions about teaching are often largely unanalyzed and thus seem based on intuition. They are "imitative" in that preservice teachers copy the behavior modeled in their own classes while growing up. Students fall back on the observed set of strategies from their childhood in times of indecision or uncertainty (Tomlinson, 1999). My study suggests that teacher decisions about educational technology may not follow the same pattern as that described by Lortie (1975).
Here, the valence of experience, not just the quantity of experience, with technology in teacher preparation classes does influence what preservice teachers plan to do with technology in the classroom. While my study did not specifically investigate what constitutes a positive experience with digital technology in teacher education classes, other literature about knowledge of technology and about the relationship between trust and experience with technology begins to reveal how teacher educators can create a more positive experience for preservice teachers. First, some literature suggests that more knowledge about educational computing in particular improves attitudes toward using technology in the classroom. Willis and Mehlinger (1996) found that completing a course on educational computing improved attitudes toward classroom technology for both inservice and preservice teachers. This also supports Zhao's (2007) finding that most teachers who were willing to use technology reported positive experiences with technology integration training. However, Zhao's study was a qualitative research project that investigated the perspectives and experiences of 17 practicing social studies teachers, whereas this study examined the influence of cumulative experiences with technology in teacher education on preservice teachers' beliefs about what they would do when faced with specific situations. Like Handler (1993), the study in this dissertation suggests that it is important for training with educational technology to be integrated into teacher education methodology courses. In this study, computer self-efficacy alone did not contribute to reported future usage of educational technology. Further, positive experiences with technology in teacher education classes were predictive of reported future usage, but positive experiences with technology in other college courses were not.
As Byrum and Cashman (1993) note, "the focus of training usually centers on the mechanics of running the computer and using basic tools and applications rather than integration into the curriculum" (p. 260). To create a positive experience with educational technology, training must focus on integrating the technology into specific pedagogical contexts rather than simply giving teachers general knowledge of computers (Oliver, 1994; Monaghan, 1993; Dunn & Ridgway, 1991). In addition to integrated training about educational technology, teacher educators can attempt to create a more positive experience with technology by promoting a trusting relationship with technology. As the results show, trusting a technology does not automatically mean a teacher will use it; rather, once trust is established, the teacher can turn to pedagogical and situation-specific philosophies. To establish a trusting relationship, we must understand how trust or distrust in technology evolves. As with defining trust, there are different ways to conceptualize the evolution or development of trust. One way to understand the development and dissolution of trust is through a symbolic interactionist perspective (Blumer, 1962; Mead, 1934). Symbolic interactionism is based on the idea that people act toward other people and other things based on the meanings they have learned to associate with them. Those meanings are developed and modified over time through social interactions. Mead (1934) suggests that a person takes the perspective of the other party to understand the other party's expectations, needs, and goals. In other words, a person takes the intentional stance (Dennett, 1996) to decide how to adjust behavior. Sitkin and Roth (1993) suggest that shared values and expectations lead to the experience of trust, whereas distrust arises from an incongruence of values rather than from unmet expectations.
George and Jones (1998), on the other hand, suggest that trust and distrust are different levels of the same construct rather than two different constructs. Further, George and Jones view trust as a dynamic experience that can move from trust to distrust based on social interactions. Like George and Jones, my study also assumes that trust and distrust in technology are different ends of a continuum of the same construct. Further, the trust-in-technology results suggest that when expectations were not met (reliability and competence), the preservice teachers were less likely to trust the digital technology, an outcome that differs from Sitkin and Roth's (1993) finding. A second assumption that must be addressed before discussing the evolution of trust and positive experiences with technology is whether people begin by trusting or not trusting technology. In other words, before ever experiencing a technology at all, does a person automatically trust or distrust it? I use the approach of George and Jones (1998), who suggest that the truster "suspends belief that the other is not trustworthy and behaves as if the other has similar values and can be trusted" (p. 535) because doing so is easier than taking on the mentally taxing intentional stance of figuring out the other party's (the technology's) motives. When the technology could pose a threat, the truster must then take on that intentional stance and decide whether or not to trust the technology. In addition to placing trust and distrust on the same continuum, George and Jones (1998) suggest that trust can be further divided into conditional and unconditional trust. Conditional trust exists when both parties are willing to work together as long as they both behave in a trustworthy manner. For the technology to be considered trustworthy, it must be reliable, competent, and honest, depending on the intended use.
Jones and George (1998) write that unconditional trust occurs when:

individuals abandon the 'pretense' of suspending belief, because shared values now structure the social situation and become the primary vehicle through which those individuals experience trust. With unconditional trust each party's trustworthiness is now assured, based on the confidence in the other's values that is backed up by empirical evidence derived from repeated behavioral interactions — knowledge of which is contained in each individual's attitude toward the other (Butler, 1983). Also, positive affect increases as positive moods and emotions strengthen the affective bonds between parties and bolster the experience of trust. (p. 537)

In this study, as in Jones and George's (1998) model, preservice teachers seemed to show more unconditional trust than conditional trust. In particular, participants' more positive moods and emotions toward technology may have arisen from positive experiences with technology in teacher education classes. Other teachers who did not have as positive an experience could be viewed as having conditional trust or even distrust. Part of developing a trusting relationship is understanding that the trustee may sometimes fail to meet the truster's expectations. The DVD might not work (reliability), the DVD might not contain the specific information we thought it was going to cover (competence), and the creators of the DVD might have claimed it covered material that it does not (honesty). For teachers who exhibit conditional trust, the failure to meet expectations may lead to distrust, whereas teachers who exhibit unconditional trust are more likely to forgive the failure. To promote unconditional trust in technology, as conceptualized here, teacher educators may need to help future teachers understand the technology being used as a social entity.
It is important to note here that unconditional trust in technology is not the same as technological dependence. When trust is viewed as a belief, as it is in this study, rather than a behavior, a teacher with unconditional trust in a digital technology can still choose not to use it for pedagogical reasons.

Trustworthiness of People versus Trustworthiness of Technology

This project started with an interest in how preservice teachers' attitudes toward and understanding of educational technology might influence when and how they think they will use it in future classes. Trust became a useful construct for exploring preservice teachers' understanding of technology. While one could argue that trust in the technology would be the act of using the technology, it was clear in this study that the belief about the trustworthiness of the technology, and how much the preservice teachers said they trusted it, was not identical to the behavior of trusting the technology. There are many reasons for using or not using educational technology, and the preservice teachers' level of trust in the technology was only one factor that contributed to reported usage. My findings are consistent with Tschannen-Moran and Hoy's (2000) suggestion that, in certain respects, the trust we place in objects differs from the trust we place in people. However, my results do indicate that certain facets extend from trusting people to trusting technology. As noted in Chapter 2, one may have confidence in an alarm clock, but when the alarm clock does not go off on time, one does feel that the alarm clock has betrayed that trust. Even if we cognitively know the alarm clock has no intentionality, we may yell at it. When we yell at the alarm clock, we are behaving as if the clock has intentionality.
When deciding to trust educational technology, a teacher is putting faith in that technology to be reliable, competent, and honest. If the technology proves untrustworthy, the teacher has wasted valuable time and the students have not learned anything. In this sense, while educational technology can be seen as a powerful instructional tool, results from this study suggest that preservice teachers also understand technology as a potential threat and, under certain conditions, report behaving toward it as a social actor. Technology is becoming more prevalent in life every day, and each day it becomes more lifelike. In a future where technology is further integrated into our education system, it will be important for teacher educators to continue to understand this changing relationship with technology. Trust is one way to conceptualize the relationship, and it provides a framework for seeing how changes may occur. For example, as particular technologies become more reliable, teachers may be able to focus on the competence of the technology for a particular subject. Or perhaps openness and benevolence will become more important as facets of the technology's trustworthiness. Nonetheless, a framework that conceptualizes trust in technology as a person's willingness to be vulnerable to the technology based on its reliability, competence, and honesty is a way for researchers to begin to think about why people may choose to use or not use technology. It can also suggest ways for designers of educational technology to develop tools that "imply" these characteristics (reliability, competence, and honesty) through their design.

Limitations and Future Study

It is difficult to anticipate at the outset of a research project what limitations one might encounter. Despite my best efforts to maximize the value of this study and minimize potential problems, certain limiting factors became evident.
These limitations, however, give rise to further exploration and research. A key limitation of this study is that students were asked what they believed they would do in the classroom once they became teachers; they were not actually observed while teaching. It may be that their intentions diverge from what they end up doing under conditions very similar to those presented in the vignettes. This limitation was insurmountable given the timetable and scope of this research project, but it would be fascinating to track students as they transition from preservice teachers to new teachers. A second potential limitation is that these students are preservice teachers, not practicing teachers. Current teachers, who may be from a different generation than the preservice teachers in this study, probably had no experience with digital technologies of the type common today when they were preservice teachers. Over 95% of the students in this sample were between 19 and 25 years old. These students, all born in the years following 1982, have grown up in an age in which digital technologies were abundant and available. This generation, often called the Millennials, uses technology at higher rates than previous generations (Junco & Mastrodicasa, 2007). According to a report released by the U.S. Department of Education, NCES (2000), novice teachers were more likely to use computers or the Internet to accomplish various teaching objectives. Further, teachers with at most nine years of teaching experience were more likely than teachers with 20 or more years of experience to use digital technology to communicate with colleagues. As this new technologically savvy generation proceeds into the classroom, it will be interesting to see whether the modeling with digital technology that occurs in teacher education classes today actually affects the use of digital technology for instructional purposes.
Do these students choose digital technology over face-to-face instruction because they are dependent on the technology, or because the technology affords the students a unique learning opportunity or saves the teacher time? Further, it will be interesting to see whether these Millennial teachers will follow in the footsteps of the older, more experienced teachers in schools or will form their own path. More research is needed on how existing teachers, specifically those who were not trained with digital technology as preservice teachers, trust or distrust classroom digital technologies that have entered education during the course of their careers. As noted earlier in this chapter, a limitation of this study is the specificity of the situations in the created vignettes. These situations were based on facets of trust, not on all the possible situations in which a teacher might decide to use digital technology for instructional purposes. Second, the situations involving digital technology in this study were not necessarily linked to the use of digital technology for meaningful learning. They were very basic situations that teachers will likely encounter, rather than specifically designed uses of technology for a specific content area. Uses of technology for specific content areas are suggested by the Technological Pedagogical Content Knowledge (TPACK) framework proposed by Mishra and Koehler (2006). Mishra and Koehler built on the concept of Pedagogical Content Knowledge (PCK), a term coined by Shulman (1986) to describe the type of knowledge required for teaching. PCK, according to Shulman, "represents the blending of content and pedagogy into an understanding of how particular topics, problems, or issues are organized, represented, and adapted to the diverse interests and abilities of learners, and presented for instruction" (p. 8).
Ultimately, the TPACK framework helps illuminate the complexity of teaching, describing the problems teachers face in helping their students learn as ill-structured problems (Koehler & Mishra, 2008). Teachers must plan within the social and curricular constraints placed on them, and to solve such ill-structured problems, they adapt plans and resources to fit their particular setting. In this study, the situations, while somewhat representative of those preservice teachers will likely encounter, did not fully encapsulate the complex environment the students will face when they enter the classroom. Nonetheless, as one student wrote in the open comments at the end of the survey, "Thanks, this survey wasn't bad. It really helped me think about technology and if I would use it in the future." Related to how positive the students' experience with digital technology was, it is important to further investigate what makes a student's experience positive or negative. Is it related specifically to the individual teacher? To the teacher education course? Or to the technology itself? And, as I have already begun to speculate, how can teacher educators create a positive experience with digital technology for preservice teachers? Finally, Fukuyama (1995) notes that there are high-trust and low-trust societies. For example, France and Italy are considered low-trust societies, whereas the US, Japan, and Germany are considered high-trust societies. Fukuyama writes that the reason the United States, Japan, and Germany were the first countries to develop large, modern, rationally organized, professionally managed corporations was that:

Each of these cultures had certain characteristics that allowed business organizations to move beyond the family rather rapidly and to create a variety of new, voluntary social groups that were not based on kinship. They were able to do so . . .
because in each of these societies there was a high degree of trust between individuals who were not related to one another, and hence a solid basis for social capital. (p. 57)

Perhaps trust in digital technology is simply an extension of trust beyond relatives, in which case teachers in the US, Japan, and Germany would be more likely to trust digital technology, and thus use it, than their counterparts in France and Italy. The U.S. preservice teachers in this study felt that honesty and reliability were the two most important factors when deciding to trust another person. Cross-cultural studies of trust in digital technology may illuminate differences in the facets of trust and further identify the role of technology as a social actor contributing to social capital. Trust is an essential element in any productive society. In an increasingly virtual world, we must now trust technology in many ways. Digital technologies have changed our approach to knowledge, and it is essential that researchers understand more precisely why some teachers choose to take advantage of this powerful instructional medium. It is only through understanding why teachers grow to behave toward digital technology in different ways that teacher educators can teach them to use digital technology in useful and creative ways. This study found that a key variable in explaining differences in future teachers' trust in and intention to use digital technology is how positive their experience with digital technology has been in their teacher education program. With this new generation of technology-fluent preservice teachers, teacher educators now have a chance to promote effective use of technology in schools by modeling positive, pedagogically suited uses of technology in their own classes.

APPENDICES

APPENDIX A
RISK LEVEL PILOT INSTRUMENT

Directions: Imagine you are a teacher in the following situation.
Based on the information given in each story, is this a high stakes situation or a low stakes situation? On a scale of 1-5, with 1 being high stakes and 5 being low stakes, what is at stake for you as a teacher and your students as learners? A high stakes situation is one in which the outcome matters a great deal for both the students and the teacher. A low stakes situation is one in which the consequence of the action will not affect the student or teacher as much as in a high stakes situation. For example, see the two situations below:

You have to use a textbook that is ten years old to teach students about genetics. A lot has changed in the field of genetics in the last 10 years. How risky is this situation?

You have to use a textbook that is ten years old to teach students about addition. Not much has changed in textbooks about addition. How risky is this situation?

Now think about what is at stake in each situation. Genetics is a new area in which new findings are constantly being made, whereas addition is a fairly stable content area. There is more at stake in the first situation than in the second, since there is a lot about genetics that was not known 10 years ago, whereas most of what is known about addition has already been written. Therefore, you would give the first situation a 1 or 2 (depending on how high you personally think the stakes are) and the second situation a 4 or 5. Some situations are more difficult than others, so a text box is provided for you to explain your answer. However, no text is necessary for you to continue ranking the level of stakes for each situation.

Risky Situations:

1a. High (internet): There are some controversial topics involved in discussions about the civil war. You want your students to learn about the civil war by going out and learning about the topic on their own. How risky is this situation?

1b.
Low (internet): There are not many controversial topics involved in discussions about trees. You want your students to learn about trees by going out and learning about the topic on their own. How risky is this situation?

2a. High (internet): Your students are learning about evolution this year. You want your students to learn about evolution by going out and learning about the topic on their own. The students need to have this information for an upcoming standardized test. How risky is this situation?

2b. Low (internet): Your students are learning about minerals this year. You want your students to learn about minerals by going out and learning about the topic on their own. This information will not be on any sort of standardized test, but you would like your students to learn about minerals. How risky is this situation?

3a. High (internet): You think it is important to check students' work for plagiarism. The next paper your students need to write is a letter to the local newspaper about a local environmental issue. This letter will ultimately go to the newspaper for publication. How risky is this situation?

3b. Low (internet): You think it is important to check students' work for plagiarism. The next paper your students need to write is a short one about bees and crops. You are the only one who will see the paper. How risky is this situation?

4a. High (e-mail): Your students have just written a big paper that they need to turn in before spring break. Your principal has encouraged you to have students turn assignments in online using e-mail. How risky is this situation?

4b. Low (e-mail): Your students have just finished a short homework assignment. You are thinking about having students turn assignments in online using e-mail. How risky is this situation?

6a. High (video/projector): In your class you would like to teach a lesson on fractions. Many things could happen that Friday to interrupt class.
It is a Friday, and the students will be tested on the material the following Monday. How risky is this situation?

6b. Low (video/projector): In your class you would like to teach a lesson on fractions. Many things could happen that Monday to interrupt class. It is a Monday, and the students will not be tested on the material. How risky is this situation?

10a. High (web service): You are going to have students look up different topics related to their family health history. Unbeknownst to many of your colleagues, the places the students will be getting information from have been selling information about what students ask. How risky is this situation?

10b. Low (web service): You are going to have students look up different topics related to adding and subtracting fractions. Unbeknownst to many of your colleagues, the places the students will be getting information from have been selling information about what students ask. How risky is this situation?

APPENDIX B
PRESERVICE TEACHER TRUST IN DIGITAL TECHNOLOGY INSTRUMENT

Directions

Thank you for agreeing to participate in this study. This survey is part of research on trust in digital technology being conducted by Andrea Francis, MA, and Punya Mishra, PhD. This survey is meant to assess how much you trust digital technology use in classroom settings. Digital technologies are those newer technologies that use digital pulses, signals, or values to represent data in computer graphics, telecommunications systems, and word processing (such as the computer, the Internet, cell phones, etc.). This does NOT include things like chalkboards and pencils. Your participation is voluntary. You may decline to complete the survey, or you may skip any item that you feel uncomfortable answering. The survey should take about sixty minutes to complete. All responses are anonymous. There are no correct or incorrect answers.
The researchers are interested only in your frank opinions, in order to determine the statistical relationships between the variables. Prior to beginning the survey you will need to read the consent form and give your consent. Your name and address are needed so that we may send you a prize if you win the raffle. If you have any questions, please feel free to e-mail Andrea Francis at andreapfrancis@gmail.com. Your time, insights, and perceptions are valuable resources. Thank you for sharing them!

Background and Interests

Demographic Information (drop-down menus):

1. Are you Male or Female?
a. Male
b. Female

2. What is your country of origin?
a. United States
b. China
c. Japan
d. Korea
e. Taiwan
f. Canada
g. Other

3. What is your age?
a. Under 18
b. 18 years
c. 19 years
d. 20 years
e. 21 years
f. 22 years
g. 23 years
h. 24 years
i. 25 years
j. 26 years
k. 27 years
l. 28 years
m. 29 years
n. 30 years
o. Above 30

4. What is the highest level of education you have completed?
a. High school
b. Bachelors
c. Masters
d. PhD
e. Associates
f. MBA
g. Other

5. What is your ethnicity?
a. White
b. African American
c. Asian
d. Latino/Hispanic
e. Native American
f. Other

6. What is the highest level of education your mother has completed?
a. Elementary School
b. Middle School
c. High School
d. Bachelors
e. Masters
f. PhD
g. JD
h. MBA

7. What is the highest level of education your father has completed?
a. Elementary School
b. Middle School
c. High School
d. Bachelors
e. Masters
f. PhD
g. JD
h. MBA

Teacher Profile

1. Are you an undergraduate or Masters student?
a. Undergraduate
b. Masters

2. What level do you teach or plan to teach?
a. Elementary
b. Secondary
c. Post Secondary

3. What subject area do you feel MOST qualified to teach?
a. Mathematics
b. Physical Science
c. Life Science
d. Social Science
e. Fine Arts
f. English
g. Applied Arts

4. What is your college year?
a. Freshman
b. Sophomore
c. Junior
d. Senior
e. Other

5. How often do you plan to use computers in classes that you will eventually teach?
a. Never
b. Probably once a year
c. Probably once a semester
d. Probably once a month
e. Daily

6. I plan to use the following when teaching classes in the future (check all that apply):
a. Microsoft WORD
b. Microsoft EXCEL
c. Microsoft PPT
d. Microsoft ACCESS
e. Internet Explorer
f. Firefox
g. Safari
h. Adobe Photoshop
i. Adobe Dreamweaver
j. MS Front Page
k. JavaScript
l. RSS feeds
m. Web 2.0
n. Digital Cameras
o. Digital Scanners
p. Photocopy machines

Teacher Trust in Digital Technology Vignettes

(Vignettes were NOT numbered or labelled.)

Directions: Imagine you are a teacher in the following situation. Based on the information given in each story, answer the questions following each story. After you answer the two questions, you may elaborate on your decision; however, you do not need to write anything in the text box. Please be honest with your answers.

Competence

1a. High (internet): You know that the internet can provide students with a great deal of information. You would like your students to use the internet to help them research the civil war. There are some controversial topics involved in discussions about the civil war. You want your students to understand how to use internet sites and learn about the civil war. Do you tell the students to use the internet for their research?

On a scale of 1-5, how likely are you to tell the students to use the internet for their research?
On a scale of 1-5, how much do you trust that these sites are going to have accurate information?

1b. Low (internet): You know that the internet can provide students with a great deal of information. You would like your students to use the internet to help them research trees. You cannot think of any controversial topics involved in discussions about trees. You want your students to understand how to use internet sites and learn about trees. Do you tell the students to use the internet for their research?
On a scale of 1-5, how likely are you to tell the students to use the internet for their research?
On a scale of 1-5, how much do you trust that these sites are going to have accurate information?

2a. High (internet): Your students are learning about evolution this year. For one of their projects you can either use the online Wikipedia as a source of information or use a set of Encyclopedia Britannica published in 1990 that the school has copies of. The students need to have this information for an upcoming standardized test. Do you tell the students to use Wikipedia for their research?

On a scale of 1-5, how likely are you to tell the students to use Wikipedia for their research?
On a scale of 1-5, how much do you trust that Wikipedia will have accurate information on this topic?

2b. Low (internet): Your students are learning about minerals this year. For one of their projects you can either use the online Wikipedia as a source of information or use a set of Encyclopedia Britannica published in 1990 that the school has copies of. This information will not be on any sort of standardized test, but you would like your students to learn about minerals. Do you tell the students to use Wikipedia for their research?

On a scale of 1-5, how likely are you to tell the students to use Wikipedia for their research?
On a scale of 1-5, how much do you trust that Wikipedia will have accurate information on this topic?

3a. High (internet): Your principal recently encouraged teachers to use an online program for submitting papers. This online submission program claims to catch whether or not students have plagiarized something. The next paper your students need to write is a letter to the local newspaper about a local environmental issue. This letter will ultimately go to the newspaper for publication. Do you use the online submission program?

On a scale of 1-5, how likely are you to tell the students to use the online submission program for the assignment?
On a scale of 1-5, how much do you trust that the online submission program will catch all types of plagiarism for this assignment?

3b. Low (internet): Your principal recently encouraged teachers to use an online program for submitting papers. This online submission program claims to catch whether or not students have plagiarized something. The next paper your students need to write is a short one about bees and crops. Do you use the online submission program?

On a scale of 1-5, how likely are you to tell the students to use the online submission program for the assignment?
On a scale of 1-5, how much do you trust that the online submission program will catch all types of plagiarism for this assignment?

Reliability

4a. High (e-mail): You teach in a school where all your students have access to the internet. Your students have just written a big paper that they need to turn in before spring break. Your principal has encouraged you to have students turn assignments in online using e-mail. Do you tell students to turn the paper in electronically?

On a scale of 1-5, how likely are you to tell students to turn the paper in electronically?
On a scale of 1-5, how much do you trust that e-mail will adequately send and receive all your students' papers?

4b. Low (e-mail): You teach in a school where all your students have access to the internet. Your students have just finished a short homework assignment. You can have them turn it in either electronically or on paper. Do you tell students to turn the paper in electronically?

On a scale of 1-5, how likely are you to tell students to turn the paper in electronically?
On a scale of 1-5, how much do you trust that e-mail will adequately send and receive all your students' papers?

6a. High (video/projector): In your class you can either give a lecture or use a DVD to teach a lesson on fractions. It is a Friday, and the students will be tested on the material the following Monday.
Do you try to show the video as part of your lesson? On a scale of 1-5, how likely are you to show the DVD as part of your lesson? On a scale of 1-5, how much do you trust the DVD to work properly?

6b. Low (video/projector): In your class you can either give a lecture or use a DVD to teach a lesson on fractions. It is a Monday, and the students will not be tested on the material. Do you try to show the video as part of your lesson? On a scale of 1-5, how likely are you to try to show the DVD as part of your lesson? On a scale of 1-5, how much do you trust the DVD to work properly?

Honesty

10a. High (web service): Your school has just signed up for a new web service. You are going to the computer lab this Friday for Health class, where students will use the internet to look up different topics related to their family health history. You find out right before you are going to the lab that this new web service, unbeknownst to many of your colleagues, has been selling information about what sites students visit to other companies. On a scale of 1-5, how likely are you to have your students use the internet (web service) in Health class? On a scale of 1-5, how much do you trust the designers of the web service to have your students' best interests at heart?

10b. Low (web service): Your school has just signed up for a new web service. You are going to the computer lab this Friday for Math class, where students will use the internet to look up and try to figure out the different rules for adding and subtracting fractions. You find out right before you are going to the lab that this new web service, unbeknownst to many of your colleagues, has been selling information about what sites students visit to other companies. On a scale of 1-5, how likely are you to have your students use the internet (web service) in Math class? On a scale of 1-5, how much do you trust the designers of the web service to have your students' best interests at heart?
General Trust Survey (SVO)

How Trusting Are You? For each of the five questions below, please check one of the five options given: Strongly Disagree (1), Mildly Disagree (2), Neutral (3), Mildly Agree (4), Strongly Agree (5).

1. Most people tell a lie when they can benefit from doing so. 1 2 3 4 5
2. Those devoted to unselfish causes are often exploited by others. 1 2 3 4 5
3. Some people do not cooperate because they pursue their own short-term self-interest. Thus, things that can be done well if people cooperate often fail because of these people. 1 2 3 4 5
4. Most people are basically honest. 1 2 3 4 5
5. One should not trust others until one knows them well. 1 2 3 4 5

Attitude Towards Risk (RISK) Questionnaire

Instructions: Indicate using a 5-point scale the degree to which each of the following statements describes you. Click 1 to indicate it does not describe you at all (not like me) and click 5 if the description is a very good description of you (like me). Use the remaining numbers to indicate the varying degrees to which the statement is or is not like you. Please read each statement carefully and then click the number that corresponds to your reply. Not Like Me (1) to Like Me (5).

1. I like the feeling that comes with taking physical risks. 1 2 3 4 5
2. While I don't deliberately seek out situations or activities that society disapproves of, I find that I often end up doing things that society disapproves of. 1 2 3 4 5
3. I often do things that I know my parents would disapprove of. 1 2 3 4 5
4. I consider myself a risk-taker. 1 2 3 4 5
5. Being afraid of doing something new often makes it more fun in the end. 1 2 3 4 5
6. The greater the risk, the more fun the activity. 1 2 3 4 5
7. I like to do things that almost paralyse me with fear. 1 2 3 4 5
8. I do not let the fact that something is considered immoral stop me from doing it. 1 2 3 4 5
9. I often think about doing things that I know my friends would disapprove of. 1 2 3 4 5
10. I often think about doing things that are illegal. 1 2 3 4 5

Below are five key parts of what makes up a trusting relationship. On a scale of 1 to 5, with 1 being most important and 5 being least important, how important is each of the following in your decision about who to trust?

The other person must be reliable
The other person must be competent
The other person must be benevolent
The other person must be honest
The other person must be open

Computer Self-Efficacy Scale (Modified from the Murphy (1989) Computer Self-Efficacy Scale)

Please answer the following questions according to your feelings of confidence for successfully performing the specified task. The range is (1) very little confidence to (5) quite a lot of confidence: 1. Very Little Confidence, 2. Little Confidence, 3. Some Confidence, 4. High Confidence, 5.
Quite a Lot of Confidence.

I feel confident working on a personal computer
I feel confident getting the software up and running
I feel confident entering and saving words or numbers into a file
I feel confident escaping/exiting from a program
I feel confident handling a disc correctly
I feel confident choosing items from an onscreen menu
I feel confident using a printer
I feel confident burning a CD
I feel confident copying a file
I feel confident adding and deleting information to and from a file
I feel confident moving the cursor around the monitor screen
I feel confident using the computer to write a letter or essay
I feel confident getting rid of files on a computer
I feel confident organizing and managing files
I feel confident using the user's guide when help is needed
I feel confident understanding terms/words relating to computer hardware
I feel confident understanding terms/words relating to computer software
I feel confident learning to use different computer programs
I feel confident learning skills to use a computer program
I feel confident using the computer to analyze number data
I feel confident describing what computer hardware does (keyboard, monitor, disk drives, processing unit)
I feel confident understanding the three stages of data processing: input, processing, output
I feel confident getting help for problems when using a computer
I feel confident explaining why a computer program does not work on a computer
I feel confident using the computer to organize information
I feel confident solving computer problems
I feel confident logging onto a mainframe computer system
I feel confident working on a mainframe computer system
I feel confident logging off the mainframe computer system

Experience with Digital Technology in TE Program

1. In how many of your education classes here at MSU were computers used for purposes of instruction (including the use of ANGEL)?
None
...
More than 6 classes

2.
On a scale of 1-5, with 1 being consistently reliable, 3 being OK, and 5 being the technology was always breaking, how would you rate the following digital technology use in classes that you have had in the education department here at MSU?

3. In general, with 1 being very positive and 5 being very negative, how positive has your experience been with the use of digital technology in the following settings? Fill in NA if the classes mentioned did not use digital technology in them.
a. All classes taken at MSU
b. Classes taken in the Teacher Education Department here at MSU
c. Classes taken in Erickson Hall

Feel free to write any further comments.

In order to receive your $2 in Sparty Cash, please type in your Student ID number below. Neither this number nor your e-mail address will be associated with your data during analysis.

APPENDIX C
CONSENT FORM

The ultimate goal of this research study is to advance the knowledge base concerning teacher education students' understanding and use of digital technology. Your participation in this study will give you a better understanding of digital technology use in classrooms and of your own beliefs about digital technology. If you are at least 18 years of age and choose to participate in this study, you will complete a survey, either online or on paper. The survey will be followed by a brief explanation of the purpose and procedures involved in this study. The entire study will take about 60 minutes to complete. If you complete the survey, you will receive $2 in Sparty Cash, which will be applied to your MSU Identification Card. You will also be entered into a raffle to win a $50 Amazon gift certificate. For every 100 people that complete the survey, one person will be chosen to receive the gift card. You have a 1 in 100 chance of winning the Amazon gift card. You will be contacted by e-mail at all times. Your participation in this study is voluntary.
Risks may include feelings of disappointment that you do not know what some of the technology terms mean. All results are confidential, and we assure you that there is no right or wrong answer in this study. You may abstain from any part of this study and are free to discontinue participation at any time. The results of this study will be treated with strict confidence. Reports of the research findings will not cite specific responses or findings that might lead to the identification of any individual participants. Your privacy will be protected to the maximum extent allowable by law. Finally, your answers will be recorded and used for later data analysis. This information will be used solely for research purposes, and reports of research findings from these results will not lead to the identification of any individual participants. All data will be kept in a secure location following the completion of the study. If you withdraw from the study, any record of your performance will be destroyed. By signing this consent form, you indicate your voluntary agreement to participate in this study. Thank you for your time.

If you have any questions about this study, please contact one of the investigators: Andrea Francis by phone: (517) 432-9609, e-mail: ploucher@msu.edu, or by regular mail: 146 Erickson Hall, College of Counseling, Educational Psychology, and Special Education, MSU, East Lansing, MI 48824; or Punya Mishra by phone: (517) 353-7211, e-mail: punya@msu.edu, 509A Erickson Hall, College of Counseling, Educational Psychology, and Special Education, MSU, East Lansing, MI 48824. If you have questions or concerns about your rights as a research participant, please feel free to contact the Institutional Review Board Office at Michigan State University: (517) 355-2180, fax: (517) 432-4503, e-mail: irb@msu.edu, or regular mail: 202 Olds Hall, East Lansing, MI 48824.
You indicate your voluntary agreement to participate by completing or returning this survey or hitting the submit button.

APPENDIX D

Dear TE student,

My name is Andrea Francis and I am a graduate student in the College of Education. For my dissertation, I am looking at how much you all trust digital technology. I know this is a busy time of year for you, but I would really appreciate any help you can give me. In a couple of days you will be receiving a request from me to complete a survey. If you decide to participate and complete the survey, I will give you $2 in Sparty Cash and enter you in a raffle to win one of several $50 Amazon gift cards as a thank-you for your help. After you receive the official request from me, you can opt out of the study at any time. If you opt out of the study, you will not receive any more e-mails from me. Your name and e-mail will not be connected to your responses during analysis. If you have any questions or if anything is unclear, please feel free to write me.

Have a good weekend,
Andrea

APPENDIX E
REQUEST FOR PARTICIPATION

Dear *First Last Name*,

This is to request your consent to participate in a study being conducted in the College of Education at Michigan State University. Your participation is voluntary; you may choose not to participate at all, or you may refuse to answer certain questions or discontinue your participation at any point. Your privacy will be protected to the maximum extent allowable by law, and all of the information will be kept in a secure location and destroyed after five years. Your answers are confidential. We have provided an identification code so that we can keep track of who has responded, but we will not connect individual names to the results in any way. You must be at least 18 years of age to participate. This research will result in knowledge regarding technology use in classrooms. This set of surveys will take about 60 minutes to complete.
If you participate, you will be asked to complete a survey about risk levels of different situations involving technology use in the classroom. As an incentive, if you do participate, you will not only receive $2 in Sparty Cash, but you will also be entered into a raffle to win a $50 gift certificate to Amazon.com. There is a 1 in 100 chance that you will win the Amazon gift card. You will be contacted by e-mail after completion and given the information for redemption of the certificate.

This research is being conducted by Andrea Francis for her Dissertation Project in the College of Educational Psychology, Counseling, and Special Education, at Michigan State University. If you have any questions about the study, you can contact the study coordinator, Andrea Francis, online at ploucher@msu.edu, by phone at 517-432-9609, or at 146 Erickson Building, MSU, East Lansing, MI 48824. If you have any questions or concerns regarding your rights as a study participant, or are dissatisfied at any time with any aspect of this study, you may contact, anonymously if you wish, the Director of Human Protection Programs at Michigan State University by phone: (517) 355-2180; fax: (517) 432-4503; e-mail: irb@msu.edu; or regular mail: 203 Olds Hall, East Lansing, MI 48824. You may indicate your consent to participate by completing the questionnaire online.

Next Steps: If you wish to participate in this research, please follow the link listed below.

*website where it will be located*

APPENDIX F
THANK-YOU/REMINDER

Dear <@first_name@>,

A while ago, you received an e-mail requesting your help by answering questions in an online survey. This is a follow-up request for your consent to participate in a study being conducted in the College of Education at Michigan State University. Your participation is voluntary; you may choose not to participate at all, or you may refuse to answer certain questions or discontinue your participation at any point.
Your privacy will be protected to the maximum extent allowable by law, and all of the information will be kept in a secure location and destroyed after five years. Your answers are confidential. We have provided an identification code so that we can keep track of who has responded, but we will not connect individual names to the results in any way. This research will result in knowledge regarding technology use in classrooms. This simple survey will take about 60 minutes to complete. If you participate, you will be asked to complete a survey about risk levels of different situations involving technology use in the classroom. If you complete the survey, you will receive $2 in Sparty Cash, which will be applied to your MSU Identification Card. You will also be entered into a raffle to win a $50 Amazon gift certificate. For every 100 people that complete the survey, one person will be chosen to receive the gift card. You have a 1 in 100 chance of winning the Amazon gift card. You will be contacted by e-mail at all times.

This research is being conducted by Andrea Francis for her Dissertation Project in the College of Educational Psychology, Counseling, and Special Education, at Michigan State University. If you have any questions about the study, you can contact the study coordinator, Andrea Francis, online at ploucher@msu.edu, by phone at 517-432-9609, or at 146 Erickson Building, MSU, East Lansing, MI 48824. If you have any questions or concerns regarding your rights as a study participant, or are dissatisfied at any time with any aspect of this study, you may contact, anonymously if you wish, the Director of Human Protection Programs at Michigan State University by phone: (517) 355-2180; fax: (517) 432-4503; e-mail: irb@msu.edu; or regular mail: 203 Olds Hall, East Lansing, MI 48824. You may indicate your consent to participate by completing the questionnaire online.

Next Steps: If you wish to participate in this research, please follow the link listed below.
APPENDIX G
LAST REQUEST

Dear <@first_name@>,

This is the last reminder I will send. I sent you an e-mail earlier this semester informing you that you would be getting a chance to receive $2 in Sparty Cash and a chance to win one of several $50 Amazon gift cards by answering questions in an online survey. This is the final follow-up request for your consent to participate in a study being conducted in the College of Education at Michigan State University. As you may have guessed, you have been chosen to get this e-mail because you are in the MSU teacher certification program. If you participate, you will be asked a series of questions regarding your beliefs about trust, risk, and whether you would use a particular digital technology when teaching. The entire set of questions will take 30 to 60 minutes to complete. I know you are busy right now and really appreciate any help you can give me. Your participation is voluntary; you may choose not to participate at all, or you may refuse to answer certain questions or discontinue your participation at any point. Your privacy will be protected to the maximum extent allowable by law, and all of the information will be kept in a secure location and destroyed after five years. Your answers are confidential and will not be connected to your identifying information during data analysis.

This research is being conducted by Andrea Ploucher Francis (www.msu.edu/~ploucher) for her Dissertation Project in the College of Educational Psychology, Counseling, and Special Education, at Michigan State University. If you have any questions about the study, you can contact the study coordinator, Andrea Francis, online at ploucher@msu.edu, by phone at 517-432-9609, or at 146 Erickson Building, MSU, East Lansing, MI 48824.
If you have any questions or concerns regarding your rights as a study participant, or are dissatisfied at any time with any aspect of this study, you may contact, anonymously if you wish, the Director of Human Protection Programs at Michigan State University by phone: (517) 355-2180; fax: (517) 432-4503; e-mail: irb@msu.edu; or regular mail: 203 Olds Hall, East Lansing, MI 48824. You may indicate your consent to participate by completing the questionnaire online.

Next Steps: If you wish to participate in this research, please follow the link listed below.

APPENDIX H
REPORTED TRUST ITEM DESCRIPTIVE STATISTICS

Table H. Reported Trust Item Descriptive Statistics

1a. You know that the internet can provide students with a great deal of information. You would like your students to use the internet to help them research the civil war. There are some controversial topics involved in discussions about the civil war. You want your students to understand how to use internet sites and learn about the civil war. Do you tell the students to use the internet for their research?
High: M = 3.51, SD = 0.75, Skew = -0.933, Kurtosis = -0.257

1b. You know that the internet can provide students with a great deal of information. You would like your students to use the internet to help them research trees. You cannot think of any controversial topics involved in discussions about trees. You want your students to understand how to use internet sites and learn about trees. Do you tell the students to use the internet for their research?
Low: M = 3.81, SD = 0.59, Skew = -1.027, Kurtosis = 2.039

2a. Your students are learning about evolution this year. For one of their projects you can either use online Wikipedia as a source of information or you can use a set of Encyclopedia Britannica published in 1990 that the school has copies of. The students need to have this information for an upcoming standardized test. Do you tell the students to use Wikipedia for their research?
High: M = 2.93, SD = 1.1, Skew = -0.294, Kurtosis = -1.101

2b. Your students are learning about minerals this year. For one of their projects you can either use online Wikipedia as a source of information or you can use a set of Encyclopedia Britannica published in 1990 that the school has copies of. This information will not be on any sort of standardized test, but you would like your students to learn about minerals. Do you tell the students to use Wikipedia for their research?
Low: M = 3.01, SD = 1.092, Skew = -0.257, Kurtosis = -0.943

3a. Your principal recently encouraged teachers to use an online program for submitting papers. This online submission program claims to catch whether or not students have plagiarized something. The next paper your students need to write is a letter to the local newspaper about a local environmental issue. This letter will ultimately go to the newspaper for publication. Do you use the online submission program?
High: M = 3.51, SD = 0.861, Skew = -0.789, Kurtosis = 0.453

3b. Your principal recently encouraged teachers to use an online program for submitting papers. This online submission program claims to catch whether or not students have plagiarized something. The next paper your students need to write is a short one about bees and crops. Do you use the online submission program?
Low: M = 3.63, SD = 0.748, Skew = -1.109, Kurtosis = 1.653

4a. You teach in a school where all your students have access to the internet. Your students have just written a big paper that they need to turn in before spring break. Your principal has encouraged you to have students turn assignments in online using e-mail. Do you tell students to turn the paper in electronically?
High: M = 3.79, SD = 0.969, Skew = -0.847, Kurtosis = 0.396

4b. You teach in a school where all your students have access to the internet. Your students have just finished a short homework assignment. You can either have them turn it in electronically or on paper. Do you tell students to turn the paper in electronically?
Low: M = 3.73, SD = 1.014, Skew = -0.77, Kurtosis = 0.137
5a. In your class you can either give a lecture or use a DVD to teach a lesson on fractions. It is a Friday, and the students will be tested on the material the following Monday. Do you try to show the video as part of your lesson?
High: M = 3.89, SD = 0.916, Skew = -0.599, Kurtosis = -0.083

5b. In your class you can either give a lecture or use a DVD to teach a lesson on fractions. It is a Monday, and the students will not be tested on the material. Do you try to show the video as part of your lesson?
Low: M = 3.98, SD = 0.839, Skew = 1.148

6a. Your school has just signed up for a new web service. You are going to the computer lab this Friday for Health class, where students will use the internet to look up different topics related to their family health history. You find out right before you are going to the lab that this new web service, unbeknownst to many of your colleagues, has been selling information about what sites students visit to other companies.
High: M = 2.1, SD = 1.081, Skew = 0.807, Kurtosis = -0.196

6b. Your school has just signed up for a new web service. You are going to the computer lab this Friday for Math class, where students will use the internet to look up and try to figure out the different rules for adding and subtracting fractions. You find out right before you are going to the lab that this new web service, unbeknownst to many of your colleagues, has been selling information about what sites students visit to other companies.
Low: M = 2.08, SD = 0.959, Skew = 0.502, Kurtosis = -0.712

Trust Total: M = 39.96, SD = 5.248, Skew = -0.448, Kurtosis = 0.239, Min = 24, Max = 51
High Risk Items: M = 19.72, SD = 2.791, Skew = -0.415, Kurtosis = -0.082, Min = 12, Max = 26
Low Risk Items: M = 20.24, SD = 2.659, Skew = -0.382, Kurtosis = 0.324, Min = 12, Max = 26

APPENDIX I
REPORTED USAGE ITEM DESCRIPTIVE STATISTICS

Table I. Reported Usage Item Descriptive Statistics

1a. You know that the internet can provide students with a great deal of information. You would like your students to use the internet to help them research the civil war. There are some controversial topics involved in discussions about the civil war.
You want your students to understand how to use internet sites and learn about the civil war. Do you tell the students to use the internet for their research?
High: M = 4.44, SD = 0.718, Skew = -1.255, Kurtosis = 1.444

1b. You know that the internet can provide students with a great deal of information. You would like your students to use the internet to help them research trees. You cannot think of any controversial topics involved in discussions about trees. You want your students to understand how to use internet sites and learn about trees. Do you tell the students to use the internet for their research?
Low: M = 4.62, SD = 0.572, Skew = -1.199, Kurtosis = 0.473

2a. Your students are learning about evolution this year. For one of their projects you can either use online Wikipedia as a source of information or you can use a set of Encyclopedia Britannica published in 1990 that the school has copies of. The students need to have this information for an upcoming standardized test. Do you tell the students to use Wikipedia for their research?
High: M = 2.91, SD = 1.358, Skew = -0.018, Kurtosis = -1.327

2b. Your students are learning about minerals this year. For one of their projects you can either use online Wikipedia as a source of information or you can use a set of Encyclopedia Britannica published in 1990 that the school has copies of. This information will not be on any sort of standardized test, but you would like your students to learn about minerals. Do you tell the students to use Wikipedia for their research?
Low: M = 2.96, SD = 1.33, Skew = 0.068, Kurtosis = -1.226

3a. Your principal recently encouraged teachers to use an online program for submitting papers. This online submission program claims to catch whether or not students have plagiarized something. The next paper your students need to write is a letter to the local newspaper about a local environmental issue. This letter will ultimately go to the newspaper for publication. Do you use the online submission program?
High: M = 3.97, SD = 1.108, Skew = -1.034, Kurtosis = 0.313

3b. Your principal recently encouraged teachers to use an online program for submitting papers. This online submission program claims to catch whether or not students have plagiarized something. The next paper your students need to write is a short one about bees and crops. Do you use the online submission program?
Low: M = 4.04, SD = 0.938, Skew = -1.166, Kurtosis = 1.42

4a. You teach in a school where all your students have access to the internet. Your students have just written a big paper that they need to turn in before spring break. Your principal has encouraged you to have students turn assignments in online using e-mail. Do you tell students to turn the paper in electronically?
High: M = 3.9, SD = 1.255, Skew = -0.99, Kurtosis = -0.152

4b. You teach in a school where all your students have access to the internet. Your students have just finished a short homework assignment. You can either have them turn it in electronically or on paper. Do you tell students to turn the paper in electronically?
Low: M = 3.32, SD = 1.27, Skew = -0.213, Kurtosis = -1.244

5a. In your class you can either give a lecture or use a DVD to teach a lesson on fractions. It is a Friday, and the students will be tested on the material the following Monday. Do you try to show the video as part of your lesson?
High: M = 3.33, SD = 1.199, Skew = -0.299, Kurtosis = -1.08

5b. In your class you can either give a lecture or use a DVD to teach a lesson on fractions. It is a Monday, and the students will not be tested on the material. Do you try to show the video as part of your lesson?
Low: M = 3.68, SD = 1.178, Skew = -0.878, Kurtosis = -0.167

6a. Your school has just signed up for a new web service. You are going to the computer lab this Friday for Health class, where students will use the internet to look up different topics related to their family health history. You find out right before you are going to the lab that this new web service, unbeknownst to many of your colleagues, has been selling information about what sites students visit to other companies.
High: M = 2.43, SD = 1.221, Skew = 0.471, Kurtosis = -0.896

6b.
Your school has just signed up for a new web service. You are going to the computer lab this Friday for Math class, where students will use the internet to look up and try to figure out the different rules for adding and subtracting fractions. You find out right before you are going to the lab that this new web service, unbeknownst to many of your colleagues, has been selling information about what sites students visit to other companies.
Low: M = 2.57, SD = 1.251, Skew = 0.257, Kurtosis = -1.186

Action: M = 42.18, SD = 6.229, Skew = -0.507, Kurtosis = 0.762, Min = 24, Max = 56
High Items: M = 20.98, SD = 3.343, Skew = -0.469, Kurtosis = 0.17, Min = 12, Max = 28
Low Items: M = 21.2, SD = 3.392, Skew = -0.442, Kurtosis = 1.058, Min = 9, Max = 29

APPENDIX J
OPEN-ENDED COMMENTS

After each item, participants were able to write any comments they felt might be important. Below are each of the items and the comments that were written. The number indicates the participant number.

You know that the internet can provide students with a great deal of information. You would like your students to use the internet to help them research the civil war. There are some controversial topics involved in discussions about the civil war. You want your students to understand how to use internet sites and learn about the civil war. Do you tell the students to use the internet for their research?

75: A discussion (or several) needs to take place about how accurate internet information can be (or not be), and what indicators people can use to help judge the source on reliability. History is also burdened with issues of perspective, so even books can be unreliable.

87: Although there are some websites providing false information, if you limit sites you don't want your students to use, such as Wikipedia, the internet is a fantastic resource and is more likely to get kids involved than a book.

33: An important thing when teaching research in a social science class is to teach students how to critique a source. Who is the author? Who do they work for? When was it written, and what sources do THEY use?
I would use the internet sources as a way to illustrate the need to investigate their findings.

44: As long as the students are aware of educational sites such as .edu, .org, or .gov, I don't have a problem with using the Internet as a source.

28: As long as the students know what they're searching for, and use multiple websites to confirm it, the information will probably be accurate. Knowing if the sources are reliable will also help.

56: Depends on age level, and I would give guidelines about where the information needed to be from.

53: I feel it is important to stress to the students that some sources may not be reliable, but it is still important for them to be able to do research on the internet. If there are controversial topics, both sides should be researched.

135: I feel that some of the sites could have good information, and I am probably somewhere between somewhat likely and very likely. I feel that both are valuable. For instance, I would tell them not to trust/be skeptical of an individual's Civil War page (e.g., "Uncle Tom's Civil War Page") but to trust pages that are made by institutions or professors. I feel though that the individual's page also could be valuable to show students how to tell if they should be trusting of a page or not. If used under the right circumstances with teacher supervision/direction, it could be a good experience and work out with whatever sources they use.

43: I may not trust that all the sites are accurate, but I think it important to educate the students about this risk and teach them to be conscientious in their internet research.

15: I only completely trust websites that are ".edu" websites, but understand that it could be biased information.

48: I think that a lot of material on the internet is untrustworthy and often biased by people's opinions. As long as you can find a scholarly website, that will be more trustworthy and accurate.
77: I think that if the students are taught how to find reliable information, they will find credible sources.

20: I think the internet can be a really useful tool, but it can't replace texts.

78: I want to teach them to be critical of what they come across on the internet regarding such controversial topics. I want to teach them that all sources, especially the internet, should be examined for their biases and cross-referenced for their accuracy. I'd advise them to limit their research to credible historical websites.

47: I will have several websites that provide good information that students will be able to learn and trust, without going on the internet and finding inappropriate information and sites.

12: I will try to teach students where to find reputable sources.

14: I would also try to guide them in finding reliable sites, and teach them about finding quality sources.

86: I would definitely have to go over what "respectable sources" are.

32: I would definitely make sure to give them examples of reputable websites.

38: I would direct them to sites that I have verified as useful.

108: I would encourage my students to use the internet. However, I would also discuss the topics and what they found in class to help them see what is real and reliable and what is not.

84: I would give them specific sites to go to that I have researched prior to giving the assignment.

30: I would have my students make sure to only use scholarly web sites and not stuff like Wikipedia. I would make sure they use the internet for facts only and not opinions of the website writers.

19: I would make sure that students understand that not all websites have accurate information and explain to them what types of websites (journals, etc.) are appropriate for research.

70: I would most likely research sites that have accurate information for my students to refer to beforehand, and provide them with a list of these sites before they start their project.

117: I would not just say "use the internet!"
There would be guidance: websites that are scholarly journals, sites whose information is trusted. Students have to be taught how to access these sites; they will not just stumble upon them on their own. 111 I would probably specify which types of sites students should look at for reliable information and explain that everything on the internet is not always accurate. 74 I would probably teach them to look at many different sites before believing information is true. 123 I would put emphasis on the fact that not all sources are credible. 93 I would speak with students about filtering good information from bad and double-checking sources. 52 "I would specify the types of sites to be used, excluding Wikipedia, etc. I would also require a primary or secondary source in book form, to ensure the students know how to get information from that as well." 66 I would want my students to use the internet for research, but I would reinforce to them constantly that anyone can put anything on the internet, so they need to question any facts, and check them against other sources as well before they believe anything. But it is important for students to know how to use the internet properly for research. 68 If the goal is to get students to learn how to use web pages, I think they should be told. However, if you want them to find a way of learning the information, they can be given the opportunity to decide on the sources that they want to use. 25 Internet sources, like (and even more so than) written sources, must be carefully scrutinized and properly cited to ensure factual information is being presented. 58 It is important to help students understand what kinds of websites provide credible information, like .gov sites and websites of academic institutions. It is also important to set boundaries about what is acceptable to include in a paper (the official website of the KKK isn't the best source of information for anything).
The internet is a valuable tool for learning, and I don't think that we should avoid it just because there is a risk of students coming across something controversial. 26 I've been taught that even some .org or .edu sites have incorrect information, but then so do books, so no source can have 100% correct information. 42 Just because it is controversial does not mean they should not find all the sources they possibly can about a topic. However, I plan to teach in a way that raises skepticism and critical thinking, so that students can analyze everything they read for truths, bias, and accuracy. 64 Make sure sources are reliable. 40 Make sure they use credible websites and also require them to have so many books for their bibliography. 95 Many times they will end up at Wikipedia for their information, so I would need to specify which types of websites are appropriate for the research they will be doing. 1 no 119 none 133 none 23 Obviously, we want to encourage students to get the most up-to-date information, but knowing that some sites are based on opinions and not factual data, we must use precaution when using the internet to support claims or refute them. 88 Part of the education process ought to include becoming aware that all sources need to be subject to verification. What a golden opportunity to demonstrate this as well as learn research skills. 99 Provide students with a list of websites that they can or cannot visit. 96 Some websites provide information that has been checked for accuracy; others can be edited by anyone. It's important to know the difference when using them as resources. 2 Students need to be told how to find a website with verifiable and trustworthy information.
35 Teach which sites can be trusted. 61 The internet is a valuable resource if students are taught how to research and find beneficial sites to gather information from. 79 The reliability of sources varies and it is important to convey this to students. 51 The sites students use have to be credible and they need to be able to prove they are, so Wikipedia is excluded from resources to be used in my classroom. 129 The sites would have to have adequate sources, which I would explain to the students. 89 There are some sites out there that are full of valuable and accurate information, but there are many where anyone can just say what they want about whatever the topic is. You need to discuss how to choose credible sources with your students before they go out and do their research. 134 There is much information about the Civil War and it is an event in history. I believe that just about all the information placed on a site will be accurate to some extent. 104 There will definitely have to be classroom time spent teaching students what a reliable source is. 85 This becomes another teachable moment on what perceptions exist about the war and/or how the internet is not always the most accurate source for information. Leading to the conclusion, that explains why some colleges have difficulty allowing it as a credible source. 27 Whether or not I trust the information depends on the website. 98 Whether or not I trust the sites depends entirely upon what the sites are. If they're somebody's web page or blog, obviously I wouldn't trust them at all unless they cite a reputable source. If the site itself is a reputable source, I would place a great deal of trust in it. 112 With so many contributors to the internet out there, it is hard to know who is providing accurate information.
60 Would have them cite their sources. 128 You can trust information from certain websites; proper research technique must be observed. 5 You could always give your students a list of websites that they could use. You know that the internet can provide students with a great deal of information. You would like your students to use the internet to help them research trees. You cannot think of any controversial topics involved in discussions about trees. You want your students to understand how to use internet sites and learn about trees. Do you tell the students to use the internet for their research? 33 Again, an integral part of research is to learn how to gauge the authenticity for yourself, and so this can be an exercise in that skill. 88 Again, everyone has an agenda, even people writing about trees. Get two or three issues with one exercise. 56 Again, I would emphasize the source of the information. Just like you don't trust everything random people say, you don't trust random websites. 117 Again, not simply opening the entire internet for them. Directed researching. 48 Again, the internet is full of sites with inaccurate information, but as long as I teach the children how to look for trustworthy sites, it should be fine for research. 75 Because facts about trees are more objective than history, it is easier to test the facts against other sources known to be reliable. 70 First of all, I wouldn't assign my students a project that I didn't understand completely myself. I would go online to research some controversial topics on trees, and then refer my students to accurate websites. 51 I still believe that the sites my students use need to be credible, since random Joe with internet access can make a website about trees with completely false information. 3 I want to be sure to let my students know beforehand what sites qualify as good trustworthy sites and what sites they should stay away from.
98 I will always tell students to use the internet for research; it's a massive resource and it's a shame not to use it. 74 I would also tell the students to use many websites before believing information is true. 108 I would comment the same as I did before. 87 Just because the issue isn't controversial doesn't mean there might not be some false information about it, but the internet is still a great resource. 47 Just like the previous comments. 40 Make sure they use credible websites and also require them to have so many books for their bibliography. 1 no 133 none 27 Once again, accuracy depends on the website. 129 Once again, the information from the internet must have adequate sources. Maybe I could show them some sources that would be more helpful in finding correct information. 128 same as 17 32 Same as above. 30 Same comments as above. 53 Since there are fewer controversial topics, they would probably be less likely to find contrasting information. 85 The only other issue here is plagiarism. 23 There still is a good deal of information that could be hazardous regardless of the appearance of controversy. Using the internet is dangerous and helpful, and students need to be walked through. 86 This differs from question 16 because I feel that earth science is much less biased than social science. 89 This is an easier topic to find credible information on, and less junk! 78 Though there is less controversy here, it is very possible that many sites can be inaccurate. I'd want them to look at more credible sites for their research to ensure that more of their research would be accurate. 28 Trees is a very broad topic, but the stipulations from the topic above remain the same. 60 Would have them cite their sources. Your students are learning about evolution this year.
For one of their projects you can either use online Wikipedia as a source of information or you can use a set of Encyclopedia Britannica published in 1990 that the school has copies of. The students need to have this information for an upcoming standardized test. Do you tell the students to use Wikipedia for their research? 33 A study in Nature magazine found that, in natural science topics, Encyclopedia Britannica had MORE factual errors than Wikipedia. Wikipedia is constantly monitored and often edited by very educated people who cite their sources. 109 Anyone can enter information on that website. 87 As long as I were to go over the information they found in class to make sure it was all accurate, I would be comfortable with my students using Wikipedia, because a lot of students might have trouble with the language used in encyclopedias. 117 Can't we use both? Evolution is a broad enough topic that Wikipedia can be used to provide SOME GENERAL information. Depending on the detail the students are required to know, other sources may be used. Not sure about a 20-year-old encyclopedia though. That may not be the best "other" source. 104 Children need to know that wikis can be edited by anyone. They are not reliable websites. 96 Edited by anyone and not always accurate. Awful source! 126 For elementary students, Wikipedia will probably have enough accurate information for them. If I were to teach a high school science class I would definitely not allow them to use Wikipedia as a source. 47 I am torn. Wikipedia is not a great source because anybody is able to write something on this site, but on the other hand the Encyclopedia is really outdated, coming from a 1990 edition. What would you do? 88 I feel like a broken record. Wikipedia is not as sloppy as it once was, and not just anyone can write just anything they want. If nothing else, use Wikipedia to find the sources listed and look there, too. 107 I feel that now Wikipedia is pretty accurate.
I would rather have them use Wikipedia than '90 encyclopedias because I'm sure there is much more information on evolution now than almost twenty years ago. 112 I find Wikipedia has done a better job of verifying sources and information. I've used it recently while preparing for a History exam here at MSU and had no points marked off for the definitions that I provided. 30 I know that Wikipedia can be edited and is not always accurate or detailed. 51 I know Wikipedia is checked by editors every so often, but I can edit its content to say what I want and I am not sure how long it will stay up there for people/students to quote. I personally don't believe it's a great resource, but it is a good tool to guide students to better resources with the links on the bottom, and maybe as an introduction to the topic. 56 I like Wikipedia, especially when it is well cited. 44 I personally sometimes use Wikipedia as a quick summary of things and then use more accurate sites to give me the proper information. 15 I will inform students that they can read about topics on Wikipedia if they would like background information on that topic, but they should always be aware that that information could be biased or incorrect, so never to use that information in a paper. 19 I would make sure to read the Wikipedia website first. 42 I would pre-screen the site for information accuracy before I told them to use it. 14 I would probably ask them to check both, and also check the sources of the Wikipedia page. 32 I would request that they check the bibliography listed at the bottom of the pages they use. 93 I would talk to students about the accuracy of Wikipedia and have them go through steps to check sources. They are likely to use Wikipedia anyway, so I want to teach them to use it intelligently. 70 I would tell my students to check the citations on Wikipedia to see if they are a reliable source.
27 I would tell them to look on Wikipedia but then check the sources listed at the bottom of the article. 100 I wouldn't trust Wikipedia to give accurate info, but I would suggest it as a starting point. 66 I wouldn't want my students to think that Wikipedia is a scholarly source, as anyone can go on the site and add or change what they want, so I would not encourage the use of it. 74 In the real world, Wikipedia is a widely used and fairly trusted source. I would tell them to use other sources as well. 61 In this situation I would have the students use both resources and compare their findings. 128 Information has to be checked: multiple sources, checking the reasoning behind statements. 43 It's a bit difficult to choose between an outdated source and an open source, but I would rather the students use their heads and glean important information that is routinely corrected and updated online, and take it with a pinch of salt. In addition, standardised test questions are unlikely to pose fact-based questions about controversial issues surrounding evolution. 64 It's a controversial topic in Wiki so some of the information might be biased. 78 It's been shown that Encyclopedia Britannica has more factual errors than Wikipedia. While the encyclopedia is a definitive article on a topic, Wikipedia is a constantly evolving document that takes into account the collective knowledge of society. Wikipedia is constantly being monitored for errors and biases, and often notes when such inaccuracies occur. I'd still ask them to be critical and reference the sources cited in the article to understand the legitimacy of the arguments presented. 113 Just make sure the Wikipedia page has credible sources. 77 Since anyone can add to a Wikipedia content site, I'm not very trusting of it.
53 Since anyone can edit Wikipedia info, it is not reliable. 67 Some Wikipedia articles have accurate information and links to the sources that the information came from, so I wouldn't mind using it as a starting place. I would probably encourage looking in the Encyclopedia, as well as other better sources, to get a more accurate perspective. 89 Supposedly there are people working for Wiki that check the information hourly, and within 24 hours of posting on high-traffic subjects the inaccurate info is taken down, within a couple days for lower-traffic topics. I think it is good for students to see different views on subjects like this and it can spark great discussion in class. 2 Teachers need to be cautious with Wikipedia. The creators are doing a better job to make sure the information on Wikipedia is sourced, but depending on the article misinformation can easily be placed. 58 The great part about Wikipedia is that it often has links to other sites, so it is possible for students to use it as a starting point for more in-depth research. 135 The problem that I have here is that in the past 20 years there has been a lot of study, and some new important facts could have come about that the encyclopedias do not have in them because they are so old. I would tell the students to read Wikipedia WITH THE ENCYCLOPEDIA'S information already in mind. I trust Wikipedia to give a good overview of a topic; however, I would never trust all of the information that appears on the page. This is because it can be added by anyone, even though it is supposed to be cited. I do, however, trust Wikipedia for that exact same reason: it is supposed to be cited, AND it can be peer reviewed. What makes that any different from an article that is about to be published academically? 134 Too much information can be changed on this site.
For standardized tests, the students will need accurate information. 98 When telling students to use Wikipedia, I would be sure to tell them that Wikipedia can be a very valuable resource IF it cites its sources. It may be valuable to do a Wikipedia project at the beginning of the year where we find places where sources are cited and information is reliable, and where a source is left out and information is questionable or clearly flawed. 99 Wikipedia can have some information that is true, but I would have the students double-check what they find with the Encyclopedia. 23 Wikipedia has several pages that are locked by a council of administrators, and Evolution is one of those pages. If the statement on the page has a cited source, click on the source and make sure it's a legitimate source. I would tell them to use the Encyclopedia first, only because then they can cross-reference that dated information with what they have found on Wikipedia. They can come to their own conclusions from there. 75 Wikipedia is a source that has a screening process (people are hired to update the information and check for sources) and requires that information not backed by sources be indicated as such (and not fact). However, the information can be changed and it would take a lot of time to check the actual sources for accuracy. Evolution, like many topics in science, is an ever-changing world. However, there are many years of research behind it and 1990 would not be so out of date. If there was a topic that had been stumbled upon more recently (like computer technology), we would have to be very current with our information (to keep up with the evolution--ha--of that topic). Ideally, however, it would be best to have the most current information about evolution. 28 Wikipedia is comprised of data from many sources and people can keep adding to it. It's hard to tell if it's accurate or not. Usually, it is believable, but I'd rather have my students gain their knowledge from experts.
But because it is a science topic and science changes all the time, it will probably help to get a more modern view of it, rather than one that is 20 years old. 131 Wikipedia is good for quick info, but I wouldn't let my students use it as a credible source to cite statistics from. 52 Wikipedia is monitored and revised. So in a controversial topic such as evolution it would be more likely that the facts are somewhat accurate or at the very least represent some sort of opinion. It would not be my first choice of a source, however. 40 Wikipedia is not a credible source. 48 Wikipedia is not always accurate. Some of the information is correct, but it is difficult to know by looking at it what to believe and what not to believe. 45 Wikipedia is not always factual - anyone can write an article. I would probably urge my students to use Wikipedia as a guide to get started, but to find more reputable sites for more concrete research. 86 Wikipedia is pretty good about enforcing verification for controversial topics. 25 Wikipedia, on average, has about as many errors as any given encyclopedia, simply due to the aging of information. It must be carefully examined, but many of the factual inaccuracies are quite easy to spot. 3 You can add any information you want to Wikipedia. It does have a lot of information, but the chance that all of that information is accurate is slim. 1 You can never really trust Wikipedia; for general information you can trust it, but not for details. Your students are learning about minerals this year. For one of their projects you can either use online Wikipedia as a source of information or you can use a set of Encyclopedia Britannica published in 1990 that the school has copies of. This information will not be on any sort of standardized test, but you would like your students to learn about minerals. Do you tell the students to use Wikipedia for their research? 78 *See answer 23.
26 A lot has been learned in the last 19 years that would probably not be in the encyclopedias, so the students would probably get more reliable information from Wikipedia. 56 Again, especially trust it if it has quality sources. 75 Again, minerals are less controversial and more objective. Evolution is still a "theory," whereas much of our information about minerals can more easily be checked against valid sources. However, I prefer to steer my students away from relying on Wikipedia for completely valid information. 117 Again, Wikipedia is not research based but can provide general information that can help direct researchers toward the more specific information they may be looking for, and they can find greater, more factual research in something such as an encyclopedia. 87 As long as I went over the information in class to correct any errors, I wouldn't mind my students using Wikipedia. 107 Because the information about minerals has probably stayed the same over the past twenty years, I would probably want them to use an encyclopedia so that they have experience physically getting information from a book. 33 I don't know anything about minerals, but I bet Wikipedia could tell me. 53 I feel like it could have accurate info, but even though they won't be tested, it is scary to have them research, learn, and obtain something that is not true. 28 I have the same answer as before. 52 I was told to use Wikipedia for one of my geology classes and the information was revised by one of the teachers, so I would be likely to advise this site for last-chance revision. 98 I would also be sure to tell the students that the Encyclopedia Britannica set is available as well. 27 I would likely recommend that they read both, but if they read Wikipedia, they should verify the article by looking at the sources. 3 I would supplement any missing or new information on minerals or evolution through class discussion if need be.
74 I would tell them to use other sources as well. 88 Just because something is written in a book, or a print encyclopedia for that matter, doesn't mean it is wholly factual. I would encourage my students to compare the information between available electronic resources, Wikipedia included, and take the opportunity to cross-check against the print encyclopedia, too. Learning ought to be about a whole lot more than just finding facts and memorizing them. How about teaching students to be critical consumers? The ability to intelligently question the authorities and the accepted wisdom is very valuable. 113 Just make sure the Wikipedia page has credible sources. 43 Minerals are not very controversial; I think it likely that the information will be pretty accurate. 48 Minerals are things that don't really change much. The information should be about the same from both sources, and I know for sure that the encyclopedias are accurate and that the authors are trustworthy. 61 Minerals do not change very much over time, so Wikipedia wouldn't provide any better information than an encyclopedia would. 67 Not that I don't trust what would be on Wikipedia, but knowledge about minerals probably hasn't changed that much since 1990, so the encyclopedia would be just as good of a source and more legitimate. 2 Refer to 23. 32 Same as above. 23 Same as above. 128 Same deal: facts need to be checked from Wikipedia, but it is a good starting point. Information in the Britannica is old and outdated. 19 Same thing, I would be sure to read the Wikipedia website first. 77 See #23 comments. 70 See above. 134 The students can use it to gain a sense of what minerals are. The whole truth might not be on the site, but most of it would be. 93 They are likely to use Wikipedia anyway, so I want to teach them to use it intelligently. 40 Wikipedia is not a credible source. Your principal recently encouraged teachers to use an online program for submitting papers.
This online submission program claims to catch whether students have plagiarized something or not. The next paper your students need to write is a letter to the local newspaper about a local environmental issue. This letter will ultimately go to the newspaper for publication. Do you use the online submission program? 56 At the very least, it will just scare the kids and please the principal. So win-win as far as I can see. 75 Because the assignment has such a local focus, I do not have much fear that the students will try to copy a paper off of a website. The principal has encouraged this and I will most likely comply (unless I feel that it is superfluous or not efficient). In this case, it seems harmless. I feel that it is easier to catch copied papers than not, because teachers can search specific lines and conjure the original up. Why shouldn't a program be able to do the same? 78 Due to the nature of this assignment, being opinion based, it would be much harder for a student to plagiarize. And if they did so, it would be to their own detriment as it probably will not be published. 25 Due to the sheer amount of published works not available in a digital format, it is impossible for any online program without an endowment of billions of dollars to have access to enough material to catch even some glaring examples of plagiarism. 58 I am a little worried that there might be a false-positive error, but I would handle all positives on a case-by-case basis. 51 I am not familiar with the program, so I am not 100% sure. 104 I do not think I will have many problems with plagiarism in elementary, but it is more environmentally friendly to submit papers online. 33 I don't like this at all. I would only have them do this if I were absolutely required to, and in that case I would still publicly oppose the policy. What a way to treat students!
I think it is usually very obvious when someone's work is plagiarized, and to integrate plagiarized material so well into a paper that a computer would be required to check the validity would require so much work that it would be easier to just write and cite. Besides, if they plagiarize, they wouldn't have a citation, and that in itself would be a problem! I think that this crossed the line, and I won't do it. 44 I have never used a program to catch plagiarism, but I know many teachers who use one. 70 I have never used such a program and don't know how reliable they are. 88 I haven't answered #28 because I have had no personal exposure to these types of programs. Once I have used them and I know more about them, then I could answer this question. I will say for the record, though, that I will not park my brain and accept the program's verdict. I will be very skeptical when the program judges a paper as plagiarized. In essence, it will bear the burden of proof, not my students. 53 I haven't heard of such a program before. 93 I might use the program if I am suspicious, after reading a student's paper, that he/she has plagiarized. I would not if I weren't suspicious. The program seems to be a "vigilante" system to me. This paper is more of a student-voice paper, which is more obvious if plagiarized. 86 I said 4 because there is ALWAYS a chance of error. 68 I think it could be helpful with papers; however, I don't think that it would be able to catch all plagiarized papers, and it can't catch the problem you run into with younger grades: parents doing the papers for the student. 28 I will still probably check the plagiarism for myself, but I like the idea of an online submission because it saves trees. Go Green! 129 I would like to think that I would trust my students enough to not plagiarize. You can usually tell if something is written that is not coming from the student. 52 I would revise it as well.
117 If a student plagiarizes and is published in a local medium, the consequences may be more dire than if they plagiarized a regular paper. I would want to do all I could to make sure students were acting wisely. 38 If I sensed a student's writing style was different from their normal style, I would check with the software. 96 If it's going to be published it certainly shouldn't be plagiarized! 107 In my experience, these websites catch popular phrases as plagiarism. I would use it if I could look at it to determine whether it was accurate or not. 128 It is possible it could accuse kids of plagiarism without it being true. I would have to understand how the program works, how it determines plagiarism. 43 It will simply be another tool in the process of ensuring academic honesty. I wouldn't make any final decisions based on the program alone. 89 I've never seen or used a program like this, so I don't really know how it works. 14 Plagiarism is an important issue, and a site that filters through papers seems like a good resource to have, but it can't catch everything. That said, it's better than nothing. 135 The problem I have here is that it seems that this assignment is a value judgement. This is not totally a research paper. There will be facts included either way that might seem like they were plagiarized, plus it is not like they are seemingly being asked to cite sources for the information they used in their letter to the editor. 40 The program is more likely to catch plagiarism. 98 The program will never be able to tell if another student wrote a unique paper for the student who turned it in. 23 Wary of how technology can be manipulated, but I've seen examples of programs doing a fine job catching plagiarism. Your principal recently encouraged teachers to use an online program for submitting papers. This online submission program claims to catch whether students have plagiarized something or not.
The next paper your students need to write is a short one about bees and crops. Do you use the online submission program? 53 Again, I haven't heard of such a program, and it seems like for this topic there would be much information on the web that is similar and could be mistaken as plagiarism. 88 Again, I really don't think, unless the program has access to some body of each student's personal work, that I will trust it very far at all. 33 As I said before. 89 Depending on the grade level and whether I want them to improve their handwriting or typing skills. 93 I am more likely to use it in this situation because plagiarism would be less easily recognized by the teacher (research-voiced vs. student-voiced). 26 I don't think it's possible to catch all types of plagiarism. 86 I feel like the short length of the paper would cause false reports. 117 I would use it only if I suspected a student was actually plagiarizing. 75 If the online submission program is accessible to students, why not have them submit the paper this way? If some students do not have access to the internet, I may think more on this matter, but the library is always open to use also. 78 I've never used these programs, so I'm not sure how well they work. Students are often so desperate to avoid doing their work, they will put in tremendous amounts of effort to plagiarize creatively. I speak from experience as a high school student. I'm not sure how well these programs work to detect this problem. I also really dislike the idea of distrusting our students so much, almost thinking of them as sneaky and untrustworthy. 85 Most plagiarism is blatant, but occasionally the focus of the project is so narrow that duplication of information and ideas/theories will occur. 129 Once again, I would prefer to trust my students rather than make the assumption that they are plagiarizing. 128 Same as 29. 14 Same comments as above. 70 See above. 23 Students should be able to produce their own work. Also same as above.
77 Students should be very careful not to plagiarize. Having them submit their paper to a program like that, whether or not it works, will most likely help them to realize the severity of the issue.
87 Whether or not it will be published, plagiarism is a serious problem, and since it will be my job as a high school teacher to prepare my students for the world, they need to be prepared for the consequences that come from plagiarism.

You teach in a school where all your students have access to the internet. Your students have just written a big paper that they need to turn in before spring break. Your principal has encouraged you to have students turn assignments in online using e-mail. Do you tell students to turn the paper in electronically?

89 Although I trust email, things can get, and have gotten, lost.
87 Although I would like to save the paper needed to print all those essays, reading 30-some essays on a computer would be very hard on my eyes.
68 Because this is a common practice in college, students need to learn to submit papers via email by a deadline. However, I would also have them hand in a paper copy to ensure that I receive them in time.
70 Email can be unreliable sometimes, even if every student has internet access. I would rather have a paper copy of a paper.
28 E-mail seems to be very reliable, but I will offer the students the option to turn in a hard copy if they wish.
78 First, I don't like electronic submission for papers. It makes me as the teacher either have to grade it online (which is difficult) or print it out myself (which is costly). Secondly, excuses for an e-mail "not sending" will allow students to turn in the paper late with little penalty, since it is hard to verify.
23 Had issues with Angel; compatibility issues should be resolved before the due date. Talk to the students and walk them through, or do a test email first.
96 Hard copies are better. Then there are no questions about it.
88 I have already used this system, dedicating an email to specific classes. An autoresponder that harvests specific information and then sends that back to the student will let them know if I received their paper or it got lost somewhere.
3 I have sent things in through email and they have gotten "lost" somehow, or my teacher has not received them. I would prefer a hard copy of the paper, especially if the students are going to be gone on spring break.
43 I know from experience that sometimes technical difficulties arise, and that not all students have internet or computer access as easily available as other students. I would say that email would be an optional method for turning the papers in.
48 I know that sometimes students may have trouble sending emails, and I personally have had a lot of technology issues of this sort. I would want the students to all try to send me the paper electronically, but if that did not work out for some students I would tell them to bring in a hard copy.
47 I know that technology is still a work in progress and there are several students who do not have access to this option, so there would be chances to turn in the paper by hand, and, as always, with documentation, if something was not sent through correctly the students would have an opportunity to get the paper in some other way.
42 I plan to accept papers any way that a student would like to give them to me. As long as I have it in my hand by the due date, no matter what form, I will accept something. Not everyone has access to the same resources.
53 I rely on the internet and email a lot, and although there are occasional problems, it is easy and fast, and a good way to be green.
13 I think it is important to hand in hard copies of the paper. I do not object to email submission; however, I know that as a student I have less anxiety about a paper if I know that I have given my professor a hard copy.
14 I trust email as a resource to collect papers.
There can be technical difficulties sometimes, but those can be worked through in a timely manner.
98 I would also provide students the opportunity to turn in a hard copy of the paper if they have any issues with emailing it.
7 I would have them hand in a hard copy and send in a digital copy.
64 I would make sure it works before the deadline.
27 I would prefer a hard copy and e-mail for at least the first assignment turned in, so as to verify that all of the students know how to do so.
30 I would rather have a hard copy so I can write on it.
52 I would require paper form as well, but I would allow email turn-ins for those who go on break early.
38 I would send a confirmation e-mail to students to verify I received their papers.
67 I would think for the most part it would be trustworthy, but I would probably have students turn in a hard copy, just to be sure.
135 If any problem comes about, you can always ask the student to sign into their e-mail account and see if they "sent" it.
93 If it saves paper, that's great. I don't like reading papers off of a screen, though, so it might end up with me printing them off myself. I would try my best to have them not be printed, however.
44 If the students were unable to send the paper or the paper did not send properly, the students could just resend the first copy. The first copy will have the original date it was sent.
32 I'm really big on helping the environment in any way I can, and submitting papers electronically is a great way to help save paper.
119 It allows for many more excuses than if you had a hard copy.
86 It is very easy for a student to make a mistake while submitting, or for them to not have the adequate computer skills to do so.
33 I've had problems with this. Papers are easier to read on paper than on a screen anyway.
2 No; e-mails get lost, and a hard copy is easier to grade.
128 Not all students, even in our time, have complete access to the internet.
Other options would have to be open to those students.
56 Rather have paper copies, though. Seems more final.
75 Some students may have difficulties, and since the students must have an e-mail to send an e-mail, I will e-mail them if I do not receive the paper. The spring break bit causes a bit of a problem (what if the students go on a vacation and cannot check their e-mails?). In this case, I would have students submit the paper two days before break so that I can check to make sure that I have all of the papers, and I can notify students (in private) if I haven't yet received their papers.
26 There are going to be students that didn't finish the assignment and will place the blame on the electronics.
40 There will always be students that say they sent their paper, and it is easier to do hard copies or do both.
25 Various email programs are pretty consistent. However, the amount of email received could possibly crash my own account. I would allow students to turn in a paper electronically, but I would make that due date the night before it would be due physically, to avoid problems.

You teach in a school where all your students have access to the internet. Your students have just finished a short homework assignment. You can either have them turn it in electronically or on paper. Do you tell students to turn the paper in electronically?

78 *See answer 35.
75 Again, I can talk to students about homework assignments not received. If the same students habitually are having "problems" and not sending in the homework assignments on time, then I might have a chat with them (they may be taking an extra day to complete under the guise that the e-mail had problems).
86 Again, there is the possibility of a mistake, but it is a short homework assignment, not an important paper before spring break.
40 There will always be students who say they sent it. Easier to use paper copies.
93 As long as I have properly taught my students how to use the email system, I trust it.
88 For the reason cited above, receipt can be verified, so my students know for certain that it arrived in my box.
14 I think it can be easier to turn in papers electronically, and I have no problems reading them on a computer.
100 I trust e-mail to get me the assignments, but if it's a short assignment I don't see why they can't hand it in manually.
74 I will also make sure to tell the students to save their work.
67 I would probably request a hard copy if I don't receive the homework by a certain date.
77 In the case of short homework assignments, I would rather have paper; in most cases, it makes it easier for the students, especially those who may not be able to make it to a computer the day of class. Plus, I do not want to have to sort through my inbox to find student assignments; it is too easy for things to get shuffled.
128 Neither option is bad. Electronic submission is easier to keep track of, but some students may have problems.
32 Same as above.
70 See above.
87 Since I will be teaching chemistry, most of my homework assignments will involve math, and therefore typing them would not be the best option.
98 Stacks of assignments cannot compare with the convenience of having digital copies of files in one folder.
33 With a short due date it's even more prone to problems.
47 Yes, unless they do not have access to the internet.

In your class you can either give a lecture or use a DVD to teach a lesson on fractions. It is a Friday, and the students will be tested on the material the following Monday. Do you try to show the video as part of your lesson?

26 I would want the students to learn from my lesson plans; the movie might teach unnecessary information or leave out parts I may think are important.
I wouldn't be worried about whether the DVD would work, but I would opt for something more interactive.
78 Unless you check the DVD and TV to make sure you can get all components to work prior to class, there is a likely chance something won't work, or you'll waste time trying to get it working. That is detrimental since they have a test approaching. Also, while it does engage some students more, it doesn't allow for the teacher to answer specific questions or to check for understanding.
75 I feel that it is very important to have face-to-face contact when able, but because the DVD is only one part of the lesson, I could integrate it (especially if it was more effective than I).
86 Sometimes movies are the perfect catalyst for napping in class.
40 Easier to give a lecture and be able to take questions and explain throughout the lecture.
93 Person-to-person interaction is more effective teaching in general. Questions can be answered as they come up. If the DVD were short and very well done, I would incorporate it into my lesson, but it would not replace my lesson.
88 It's not a matter of trusting the DVD; I just doubt that it could do a superior job.
14 I would make sure the DVD was adequate beforehand. Also, depending on the length of the DVD, I would try to lecture as well and strengthen the concepts as necessary.
100 I'd rather teach myself and not have to use DVDs. Then I can show my students that I know the material.
74 I would rather teach the lesson directly. It will give the students more of a chance to ask questions and be interactive.
67 I would probably try to do a mix of video and lecture: first, to make sure the students understand the material fully, and second, to try to curb boredom.
77
128 Without reviewing the DVD there is no way to know. Usually some sort of discussion would have to take place for it to be a good learning experience.
32 I'd rather lecture because it's easier to pause a live lecture than a DVD if there are any questions.
70 There is always the chance that it will not work, so I would prefer to teach the lesson myself. The video could be more supplemental.
87 Using a DVD might engage students who usually don't pay attention.
98 As long as the DVD is short, it will break up the lecture nicely. If it fails, I could always just keep talking. It can't fail if I don't try it.
33
47 I believe it would go smoothly as long as I reviewed it and felt it was beneficial for the students beforehand.
89 If the video helps the students, I would definitely try to fit it in; otherwise no.
23 Not too hard.
96 DVDs can't monitor and gauge students' understanding!
3 I trust the DVD to work; I just don't trust it to adequately teach the material since the students are being tested on it the next week.
48 In my college classes my teachers have had trouble getting their DVDs to work. I know better than to completely trust technology. I would try the DVD but have a backup plan just in case it didn't work.
42
53 I wouldn't use it as the only part of the lesson, but it could be beneficial to use as part of the lesson, since students learn in different ways.
64 Stuff happens sometimes, and the video system might not work.
27 I would have backup lesson plans in case the DVD does not work for some reason.
30 A DVD does not understand the different learning patterns of students, and it does not answer all questions.
52 Other explanations sometimes allow for different understandings or connections that could help out a student. I would show it at the beginning so the students will pay attention and not check out for the day.
38 I would integrate it into the lecture because some students need visuals to learn.
135 I am answering this question as if I did not have to show the ENTIRE video, where I could show parts that I felt were relevant. I would try to pair it with lecture. I also understand that people learn in different ways, and lecturing is not always the best/most effective way for all students to learn.
Plus, during the lecture the students can ask questions.
44 If I had used the video before or knew that the video would aid in the lesson, I would use it. However, most students are very lazy on Fridays, so I probably wouldn't use it.
2 I would have to watch the DVD first. Depending on the quality of the DVD, I would allow the kids to watch it earlier in the lesson. I would much rather have a review session where we discuss fractions; that way students can ask whatever questions they have, and the DVD would take away time from this.
25 DVD players and discs themselves are pretty good about working properly, but a DVD would take more time to teach the subject than I would, I believe, and given the short amount of time, it seems a waste to show it.
58 I think a DVD would be a good tool in conjunction with other activities for the day.
107 If I teach it myself, I will be better able to detect student learning and clear up more misconceptions.
61 I would use a combination of lecture and DVD. Media is a teaching tool, not a primary source for a lesson.
19
45 I feel students could learn better when they're able to ask questions. You can't question a DVD. Even though the info will be factual, you can't replace teaching with a DVD.
35 I'll be out of a job.
73 The students could have more interaction with the lecture, and ask questions easier.
83 I find actual teaching to be much more interesting and dynamic than someone else teaching a lesson on a DVD.
92 Obviously, I would pick one that I have screened and I feel can adequately cover the material the students would be tested on.
130 Might use it as a supplement, but I would not rely on the DVD for something they will be tested on the next class period.

In your class you can either give a lecture or use a DVD to teach a lesson on fractions. It is a Monday, and the students will not be tested on the material. Do you try to show the video as part of your lesson?

93 Same response as above.
135 I would not approach it any differently than stated above.
89 I think that it's good to change up the instructional activities.
70 See above.
78 Again, I only trust it if I have put in the effort to check it before class. DVDs can be very helpful in engaging certain students.
88 I feel that much of the hit-or-miss with technology in the classroom is teacher familiarity... One reason it blows my mind that MSU requires some of the more dated classes like SME401 but doesn't require something as useful as CEP416. Seems completely backwards...
128 Things like DVDs can be good tools for information and for learning.
2 Refer to 43. It depends on the video.
86 I would try to incorporate it if they wouldn't be tested on the material, but I think there are better ways to encode the information.
40 Easier to give a lecture and be able to take questions and explain throughout the lecture.
32 Same as above.
47 Same as above.
3 Again, I trust the DVD to work; it is just hard for DVDs to connect to students in a meaningful way. I would preview the DVD to see the content that was involved and maybe use it to supplement my lesson.
13 I would do this as long as I had time in my curriculum to review it with the students myself.
38 I would be more likely to use this when they won't be tested on it directly after.
44 If I knew the video would interest the students and aid in their understanding of the topic, I would use it.

Your school has just signed up for a new web service. You are going to the computer lab this Friday for Health class, where students will use the internet to look up different topics related to their family health history. You find out right before you are going to the lab that this new web service, unbeknownst to many of your colleagues, has been selling information about what sites students visit to other companies.

117 Completely unacceptable.
Obviously this risk is there on any website, but if I know about it with certainty then there is no way that this site will be used.
25 Data mining on minors in public schools is completely unethical.
93 Does it truly affect my students (besides in the long run)? If I thought the internet lesson was a great one, I wouldn't cancel it based on this information.
135 Family health histories? Enough said.
87 I don't feel comfortable with the idea of my students' information being sold on the internet.
100 I don't think that schools should sell information about their students, or use products that do.
89 I don't think that where the students go online should be sold to other companies unless everyone involved knows and agrees.
74 I don't think this could necessarily hurt the students...
70 I think this is self-explanatory.
14 I would be wary about using a program that breaches the privacy of students. I would also try to bring the issue up with the school that signed up with it, and discuss whether or not it is the best option in this case.
85 I would have to contact the designers to understand their third-party relationships and look over their privacy policy. (This should have been done by the school's IT department.)
75 I would try to get more information and to talk with others about their knowledge.
98 I'd also be questioning the morality of the school I'm working for, which didn't read the fine print before installing strange software.
78 If they are selling this information, as an educational service, I doubt they are interested in anything more than profit. This information is private and sensitive to the individual student.
88 Not just for question #46, either. I don't trust any corporation to have my students' best interests at heart. I trust corporations to have their bottom dollar at heart. This makes them very predictable... kind of like journalists in this respect.
128 Not really enough info to make a good judgment. Too many factors to weigh.
2 Students deserve their privacy. There are enough public resources that would not infringe on students' rights.
23 Students' health information is private.
58 This is a violation of students' privacy, plain and simple.

Your school has just signed up for a new web service. You are going to the computer lab this Friday for Math class, where students will use the internet to look up and try to figure out the different rules for adding and subtracting fractions. You find out right before you are going to the lab that this new web service, unbeknownst to many of your colleagues, has been selling information about what sites students visit to other companies.

128 Again, not enough info, but looking up fractions is less intrusive than health issues.
100 Even though the topic isn't that important, I still don't like the idea of my students' privacy being invaded.
85 Here the information may not be as intrusive on personal information, but the precedent is necessary to keep in mind. (Again, if the IT department has not been informed of a privacy policy, then this is a moot point.)
3 I don't like the idea of third parties in any context.
44 If the students were not putting in any valuable information, then I don't see the harm. Many sites relay this information to their advertisers.
23 Information about my students shouldn't be sold.
78 Much like 47, I don't trust the service. However, this information is much less controversial and sensitive in nature, since all students will be exploring the same information.
2 Refer to 47.
89 Same as 47.
93 Same as above.
88 See #47.
70 See above.
77 The corporate world is money-hungry, power-hungry, and often unconcerned about the individuals involved.
135 This is more academic. What could harm my students by having information about their habits of visiting math websites?
134 This might allow companies to know what sites are useful to students learning about math and which sites they should possibly fund or advertise.

Comments?
88 A very fascinating survey. I wish you luck in your work.
60 Angel tends to have many problems and has proven to be a hassle. I do not like having to submit assignments online because I never know for sure if they went through.
105 At the end of this research, I would be interested to see the results.
74 I didn't know what mainframe means.
9 I feel like a lot of technological issues I have witnessed in my classes at Michigan State revolve around teachers not necessarily knowing how to use them.
123 I have really enjoyed working with technology. Individuals must be cognizant of the fact that there are always issues with technology, and you should prepare alternative lessons. Also, you should be aware that sometimes technology is not the best way, and some sources are not accurate.
84 I think that Angel is not a good resource for instructors to use at MSU. It is not a very reliable system.
127 I think that many of the older teachers in elementary classrooms do not have proper computer education, but my generation of teachers will have that knowledge to incorporate it into the classroom instead of having just the computer lab do so.
115 I think the risk part needs to be more specific. I do things that society and my parents don't approve of, but it isn't bad stuff... just belief-wise, what I decide to make a priority, going against what my friends want to do, etc.
89 I think the technology at MSU is good, but Angel can be temperamental; I think it's because they are merging with another company and trying to figure that out. Although there are issues with technology, and it is frustrating, it is still very helpful and useful in teaching and also learning.
59 I thought that this was a very interesting survey!
75 I would like more skill training with technology, especially using the technology in the classroom.
33 I'm tired of being asked to blog on Angel every week of every semester.
The responses are always forced and I don't get a lot out of reading them. 7 Most of my technological problems are due to Angel. 86 Overall, I would say that I do trust technology and it is a great thing that we should embrace, but in the end, human interaction trumps all. We should use technology but cannot become dependent on it. 131 Students need to be taught and shown how to determine whether or not a website is credible for research. Many teachers (at MSU) dislike Wikipedia - but for students who are totally unaware of baseline subject matter - it helps us begin to understand the foundations of a subject or content area in a relatively quick and easy way. 12 Technology seems to be very useful in classes, but some bugs need to be worked out of ANGEL first. 24 Technology will never be 100% reliable, and I think I am glad about that. 76 Thank you and good luck with your work! 134 Thanks, this survey wasn't bad. It really helped me think about technology and if 1 would use it in the future. 159 128 The questions are a little slanted. I feel like there is an objective of the survey and also a lack of information about technology and technology in the classroom. This survey fed to many students could turn out with unreliable remarks. I am worried that this survey will not show exsactly the feeling about technology in the classroom 77 To be honest, I'm not even sure what the term "mainframe" computer refers to!! 23 While technology is great and all- it shouldn't be the only way to encourage students to learn. Yes we should move along with updating technology and our classrooms, but over-sensationalizing technology will make students wonder why they have to learn outdated methods (math with calculators). 160 REFERENCES 161 REFERENCES AACTE. (1999) Teacher education pipeline IV: Schools and departments of education enrollments by race, ethnicity, and gender. Washington DC: American Association of Colleges of Teacher Education. Alvarez-Torres, M., Mishra, P., & Zhao, Y. 
(2001). Judging a book by its cover. Cultural Stereotyping of interactive media and its effect on the recall of text information Journal of Educational Multimedia and Hypermedia, 10, 2, 161- 183. Anderson, S., & Maninger, R, (2007). Preservice teachers' abilities, beliefs, and intentions regarding technology integration. Journal of Educational Computing Research, 37(2), 151-172.) Attewell, P. (2001). The first and second digital divides. Sociology of Education, 74(3), 252-259. Baier, A. C. (1986). Trust and antitrust. Ethics, 96, 231-260. Bandura, A. (1997). Self-efficacy: The exercise of control. New York: Freeman. Blumer, H. (1962). Society as symbolic interactionism. In A. Rose (Ed.), Human behavior and social processes. London: Routledge and Kegan Paul. Brookhart, S.M., & Freeman, DJ. (1992). Characteristics of entering teacher candidates. Review of Educational Research, 62(1), 37-60. Bryk, A.S. & Schneider, B. (2002). Trust in Schools: A Core Resource for Improvement. New York: Russell Sage Byrum, D. C. & Cashman, C. (1993) Preservice teacher training in educational computing: problems, perceptions and preparation, Journal of Technology and Teacher Education, 1, pp. 259-274. Cohen, D.K. (1988). Educational technology and school organization. In Nickerson & P. Zodhiates (Eds.), Technology and Education: Looking toward 2020. Mahwah, NJ: Lawrence Erlbaum Associates. Coleman, James S. (1990). Foundations of Social Theory, Cambridge, MA: Harvard University Press. Collins, A. and Halverson, R. (2009). Rethinking education in the age of technology. New York, NY: Teachers College Press. Cuban, L. (2001). Oversold and underused: Computers in the classroom. Cambridge, 162 MA: Harvard University Press. Cummings, L.L., & Bromily, P. (1996). The organizational trust inventory (OTI): Development and validation, in Kramer, R. and Tyler, T. (Eds), Trust in Organizations. Thousand Oaks, CA: Sage. Dennett, D. C. (1987). The Intentional Stance. Cambridge, MA.: MIT Press. 
Dunn, S. & Ridgway, J. (1991). Naked into the world: IT experiences on a final teaching practice: a second survey. Journal of Computer Assisted Learning, 7(2), 229-240. Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265-279. Ertmer, P. A. (2005). Teacher pedagogical beliefs: The final frontier in our quest for technology integration? Educational Technology Research and Development, 53(4), 25-39. Ertmer, P. A., Addison, P., Lane, M., Ross, E., & Woods, D. (1999). Examining teachers’ beliefs about the role of technology in the elementary classroom. Journal of Research on Computing in Education, 32(1), 54—72. Ertmer, P.A., Everbeck, E., Cennarno, K.S. & Lehman, JD. (1994). Enhancing self- efficacy for Computer technologies through the use of positive classroom experiences. Educational Technology, Research & Development, Vol. 42(3), 45- 62. Ertmer, P. A., Gopalakrishnan, S., & Ross, E. M. (2001). Technology-using teachers: Comparing perceptions of exemplary technology use to best practice. Journal of Research on Technology in Education, 33(5). Available online at http://www.iste.org/jrte/33/5/ertmer.html Floden, R. & Bell, J. (2006). Professional development for meaningful learning using technology: In E. Ashbum & R. F loden (Eds), Meaningful learning using technology: What educators need to know and do (pp. 180-192). New York: Teachers College Press. F ranken, R.E., Gibson, K.J., & Rowland, G.L. (1992). Sensation Seeking and the tendency to view the world as threatening. Personality and Individual Differences, 13, 31-38. F ukuyama, Francis (1995). Trust: The Social Virtues and the Creation of Prosperity. New York: Free Press, 1995. Gardner, H. and Benjamin, J. (August 2004). Can there be societal trustees in America 163 today? Presented at The Forum for the Future of Higher Education’s Aspen Symposium. Published in Forum Futures 2005, pp 19-22 Garland, K.J. & Noyes, J .M. (2008). Computer attitude scales: How relevant today? 
Computers in Human Behavior, 24, 563-575. Guskey, T.R. (1986). Staff development and the process of teacher change. Educational Researcher, 15(5), 5-12. Handler, MG. (1993). Preparing new teachers to use computer technology: Perceptions and suggestions for teacher educators Computers in Education, 20(2), 147-156. Hardin, R. (2002). Trust and T rustworthiness. New York, NY: Russell Sage Foundation. Henderlong, J ., & Lepper MR. (2002). The effects of praise on children’s intrinsic motivation: A review and synthesis. Psychological Bulletin, 128, 774-795 Hew, K.F., & Brush, T. (2007). Integrating technology into K-12 teaching and learning: Current knowledge gaps and recommendations for future research. Educational Technology Research and Development, 55(3), 223-252. Hill, T, Smith, N.D. & Mann, M.F; (1987). Role of efficacy expectations in predicting the decision to use advanced technologies: The case of computers. Journal of Applied Psychology, Vol. 72(2), 307-313. Hitlin, P., & Rainie L. (2005). Teens, technology, and school (Data Memo). Washington, DC: Pew Internet and American Life Project. Retrieved from www.pewintemet.org/pdfs/PIP_Intemet_and_Schools_05.pdf. Hosmer, LT. (1995) ‘Trust: The connecting link between organizational theory and Philosophical Ethics,’ The Academy of Management Review 20(2): 379-403. Hoy, W.I(., & Tschannen-Moran, M. (2003). The conceptualization and measurement of faculty trust in schools: The Omnibus T-Scale. In W.K. Hoy & C.G. Miskel (Eds), Studies in leading and organizing schools (pp. 181—208). Greenwich, CT: Information Age. Hu, W. (2007, May 4). Seeing no progress, some schools drop laptops. New York Times. Retrieved from http://www.washtimes.com. lntemational Society for Technology in Education (2008). National education technology standards. Retrieved from http://www.iste.org/Content/NavigationMenu/NETS/ForTeachers/2008Standards/ NETS_T_Standards_Final.pdf. Jones, G.R. & George, J.M. (1998). 
The experience and evolution of trust: Implications 164 for cooperation and teamwork. The Academy of Management Review, 23(3), 531- 546. Junco, R. & Mastrodicasa, J. (2007). Connecting to the Net. Generation: What higher education professionals need to know about today's students. U.S.A: NASPA. Kamins, M. L., & Dweck, C. S. (1999). Person versus process praise and criticism: Implications for contingent self-worth and coping. Developmental Psychology, 35, 835—847. Kellenberger, D. (1997). Predicting pre-service teacher perceived computer use under differential access to resources. Journal of Educational Computing Research, I 6, 53-64. Kelly, M. (2008). Bridging digital and cultural divides: TPCK for equity of access to technology. Paper presented at the meeting of the American Educational Research Association, New York, NY. Khorrami-Arani, O. (2001). Researching computer self-efficacy. International Education Journal, 2(4), 17-25. Koehler, M. J ., & Mishra, P. (2008). Introducing TPCK. In AACTE Committee on Innovation and Technology (Ed), Handbook of technological pedagogical content knowledge (T PCK) for educators (pp. 3-29). New York: Routledge Kramer, RM. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Reviews of Psychology, 50, 569-598. Lenhart, A., Rainie, L., & Lewis, 0. (2001). Teenage Life Online: The Rise of the Instant- Message Generation and the Internet's Impact on Friendships and Family Relationships. Washington, DC: Pew Internet & American Life Project. Lesh, R., & Lehrer, R (2000). Iterative refinement cycles for videotape analysis of conceptual change. In A. E. Kelly & R. A. Lesh (Eds), Handbook of research design in mathematics and science education (pp. 992). Mahwah, NJ: Lawrence Erlbaum Associates. Levin, T., & Wadmany, R. (2006). Teachers’ beliefs and practices in technology-based classrooms: A developmental view. Journal of Research on Technology in Education, 39(2), 1 57—1 81. Lortie, D. (1975). 
Schoolteacher. Chicago: University of Chicago Press.
Luhmann, N. (1979). Trust and power. Chichester: Wiley.
MacGillis, A. (2004). Law, software fuel new 'digital divide'. The Baltimore Sun. Retrieved from http://www.baltimoresun.com
Marakas, G.M., Yi, M.Y., & Johnson, R.D. (1998). The multilevel and multifaceted character of computer self-efficacy: Toward clarification of the construct and an integrative framework for research. Information Systems Research, 9(2), 126-163.
March, J.G. (1994). A primer on decision making. New York: Free Press.
Marcinkiewicz, H.R., & Regstad, N.G. (1996). Using subjective norms to predict teachers' computer use. Journal of Computing in Teacher Education, 13(1), 27-33.
Maslow, A.H. (1943). A theory of human motivation. Psychological Review, 50, 370-396.
McAllister, D.J. (1995). Affect- and cognition-based trust as foundations for interpersonal co-operation in organizations. Academy of Management Journal, 38, 24-59.
McCrory, R. (2006). Technology and science teaching: A new kind of knowledge. In E. Ashburn & R. Floden (Eds.), Meaningful learning using technology: What educators need to know and do (pp. 141-160). New York: Teachers College Press.
Mead, G.H. (1934). Mind, self and society. Chicago: University of Chicago Press.
Meyer, W.U., Mittag, W., & Engler, U. (1986). Some effects of praise and blame on perceived ability and affect. Social Cognition, 4(3), 293-308.
Mishra, P. (2006). Affective feedback from computers and its effect on perceived ability and affect: A test of the computers as social actor hypothesis. Journal of Educational Multimedia and Hypermedia, 15(1), 107-131.
Mishra, P., & Koehler, M. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017-1054.
Mueller, C.M., & Dweck, C.S. (1998). Praise for intelligence can undermine children's motivation and performance. Journal of Personality and Social Psychology, 75, 33-52.
Monaghan, J. (1993).
IT in mathematics initial teacher training - factors influencing school experience. Journal of Computer Assisted Learning, 9(2), 149-160.
Murphy, C.A., Coover, D., & Owen, S.V. (1989). Development and validation of the computer self-efficacy scale. Educational and Psychological Measurement, 49, 893-899.
Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81-103.
NCES (2000). Internet access in U.S. public schools and classrooms: 1994-99. Washington, DC: NCES 2000-086.
NEA. (2003). Status of the American public school teacher, 2000-2001. Washington, DC: National Education Association.
Ojeda-Zapata, J. (2010, January 11). iPods and educational applications have Minnesota students giddy about learning. TwinCities.com Pioneer Press. Retrieved from http://www.twincities.com
Oliver, R. (1994). Factors influencing beginning teachers' uptake of computers. Journal of Technology and Teacher Education, 2, 71-89.
Pajares, M.F. (1992). Teachers' beliefs and educational research: Cleaning up a messy construct. Review of Educational Research, 62(3), 307-333.
Parks, C.D., & Hulbert, L.G. (1995). High and low trusters' responses to fear in a payoff matrix. Journal of Conflict Resolution, 39(4), 718-730.
Parks, C.D., Henager, R.F., & Scamahorn, S.D. (1996). Trust and reactions to messages of intent in social dilemmas. Journal of Conflict Resolution, 40(1), 134-151.
Parsad, B., & Jones, J. (2005). Internet access in U.S. public schools and classrooms: 1994-2003 (NCES 2005-015). Washington, DC: Department of Education, National Center for Education Statistics.
Suppes, P. (1980). In R.P. Taylor (Ed.), The computer in the school: Tutor, tool, tutee. New York, NY: Teachers College Press.
Pintrich, P.R. (1990). Implications of psychological research on student learning and college teaching for teacher education. In W.R. Houston (Ed.), Handbook of research on teacher education (pp. 826-857). New York: Macmillan.
Pope, M., Hare, D., & Howard, E. (2005). Enhancing technology use in student teaching: A case study. Journal of Technology in Teacher Education, 13(4), 573-618.
Putnam, R.D. (1993). Making democracy work: Civic traditions in modern Italy. Princeton, NJ: Princeton University Press.
Putnam, R. (2000). Bowling alone: The collapse and revival of American community. New York: Simon and Schuster.
Quality Education Data (QED) Report. (2004). 2004-2005 technology purchasing forecast (10th ed.). New York: Scholastic Company.
Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press/CSLI.
Reid, T. (2007, March 3). Internet porn pop-ups cost this teacher her job and her freedom. Times Online. Retrieved from http://technology.timesonline.co.uk
Rempel, J.K., Holmes, J.G., & Zanna, M.P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49, 95-112.
Selwyn, N. (1999). Differences in educational computer use: The influences of subject cultures. The Curriculum Journal, 10(1), 29-48.
Shashaani, L. (1997). Gender differences in computer attitudes and use among college students. Journal of Educational Computing Research, 16(1), 37-51.
Shulman, L. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4-14.
Simon, H.A. (1971). Designing organizations for an information-rich world. In M. Greenberger (Ed.), Computers, communication, and the public interest. Baltimore, MD: Johns Hopkins Press.
Sitkin, S.B., & Roth, N.L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367-392.
"trust." (1996). In T.F. Hoad (Ed.), The concise Oxford dictionary of English etymology. Oxford: Oxford University Press. Oxford Reference Online. Michigan State University Library.
Retrieved from http://www.oxfordreference.com.proxy2.cl.msu.edu:2047/views/ENTRY.html?subview=Main&entry=t27.e16098
Tschannen-Moran, M. (2001). Collaboration and the need for trust. Journal of Educational Administration, 39, 308-331.
Tschannen-Moran, M. (2004). Trust matters: Leadership for successful schools. San Francisco: Jossey-Bass.
Tschannen-Moran, M., & Hoy, W.K. (2000). A multidisciplinary analysis of the nature, meaning, and measurement of trust. Review of Educational Research, 70, 547-593.
Turkle, S. (2005). The second self: Computers and the human spirit. Cambridge, MA: MIT Press.
Tyack, D., & Cuban, L. (1995). Tinkering toward utopia: A century of public school reform. Cambridge, MA: Harvard University Press.
U.S. Department of Education, Office of Educational Technology. (2004). Toward a new golden age in American education: How the Internet, the law and today's students are revolutionizing expectations. Retrieved January 2, 2008, from http://www.ed.gov/about/offices/list/os/technology/plan/2004/site/docs_and_pdf/
Venkatesh, V., Morris, M.G., Davis, G.B., & Davis, F.D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425-478.
Warschauer, M., Knobel, M., & Stone, L. (2004). Technology and equity in schooling: Deconstructing the digital divide. Educational Policy, 18(4), 562-588.
Willis, J.W., & Mehlinger, H.D. (1996). Information technology and teacher education. In J. Sikula (Ed.), Handbook of research on teacher education (pp. 978-1029). New York: Macmillan Library Reference.
Xie, Y., & Shauman, K.A. (2003). Women in science: Career processes and outcomes. Cambridge, MA: Harvard University Press.
Yamagishi, T. (1986). The provision of sanctioning as a public good. Journal of Personality and Social Psychology, 51(1), 110-116.
Yamagishi, T., & Sato, K. (1986). Motivational bases of the public goods problem. Journal of Personality and Social Psychology, 50(1), 67-73.
Zhao, Y. (2007).
Social studies teachers' perspectives of technology integration. Journal of Technology and Teacher Education, 15(3), 311-333.
Zhao, Y., Pugh, K., Sheldon, S., & Byers, J.L. (2002). Conditions for classroom technology innovations. Teachers College Record, 104(3), 482-515.
Zhao, Y., & Cziko, G. (2001). Teacher adoption of technology: A perceptual control theory perspective. Journal of Technology and Teacher Education, 9(1), 5-30.
Zumwalt, K., & Craig, E. (2005). Teachers' characteristics: Research on the demographic profile. In M. Cochran-Smith & K. Zeichner (Eds.), Studying teacher education: The report of the AERA panel on research and teacher education (pp. 111-156). New Jersey: Lawrence Erlbaum Associates.