EVOLUTION OF A TASK: TRACKING A TEACHER’S TASK-REVISION PROCESS ALONG WITH STUDENT ENGAGEMENT By Ian Solheim A THESIS Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Teaching English to Speakers of Other Languages – Master of Arts 2018 ABSTRACT EVOLUTION OF A TASK: TRACKING A TEACHER’S TASK-REVISION PROCESS ALONG WITH STUDENT ENGAGEMENT By Ian Solheim In this action research report, I investigate the process of a teacher (that is, me) implementing task-based language teaching (TBLT). I document my own construction of a language learning task and the changes I made as I taught it over and over to new sets of students. I also examine how the changes affected students’ level of task engagement. I conducted the research over six weeks with twenty sets of students at two Chinese universities as part of a Summer English Communication Program. I employed a questionnaire and a teaching journal as data collection instruments. Analysis of student engagement revealed that the teacher’s revisions had limited impact on engagement, while analysis of the teacher’s TBLT implementation process showed that using tasks provided unforeseen context-specific benefits. ACKNOWLEDGEMENTS I would like to thank all the people who made this thesis a success, starting with my committee members, Dr. Paula Winke and Dr. Koen Van Gorp, who helped keep me motivated and excited about the project in addition to providing excellent advice and feedback every step of the way. I would especially like to thank my advisor, Dr. Paula Winke, who was flexible and generous with her time as she advised an unusually large number of MA candidates this year. Next, I would like to thank Allyson Knox, my mentor and stepmother without whom I would not have gone for my MA. Thank you for your vision for me, your unparalleled consulting and guidance, and for the gift of education. My nuclear family, Mark Solheim, Janie Funkhouser, and Julia Solheim lent a supportive ear many times as I faced the challenges of grad school. They always expressed complete faith in me. Thank you for your love and support. Finally, I would like to thank Bob Eckhart and Minru Li, who are behind the outstanding summer teaching program that enabled me to conduct my research. Thank you for your hard work! iii TABLE OF CONTENTS _Toc513032272LIST OF TABLES ............................................................................................. vi LIST OF FIGURES .................................................................................................................... vii CHAPTER 1: RESEARCHING TASK DESIGN THROUGH ACTION RESEARCH ........1 What are Tasks and Why Use Them?............................................................................. 1 How Have Tasks Proliferated? ........................................................................................ 2 Teachers Engaging with Tasks: Tasks are Dynamic ..................................................... 3 Action Research and TBLT ............................................................................................. 6 Learners Engaging with Tasks ........................................................................................ 7 CHAPTER 2: METHODS ..........................................................................................................10 Participants ...................................................................................................................... 
10 Materials .......................................................................................................................... 10 Task engagement survey. ...................................................................................... 10 Task materials. ...................................................................................................... 10 Teacher’s journal. ................................................................................................. 11 Procedure ......................................................................................................................... 11 Analysis ............................................................................................................................ 11 CHAPTER 3: RESULTS ............................................................................................................14 What I did/what happened ............................................................................................. 14 Quantitative Results........................................................................................................ 15 Principle Component Analysis. ............................................................................ 15 ANOVA. ............................................................................................................... 16 Qualitative Results. ......................................................................................................... 17 Usefulness from the teacher’s perspective ............................................................ 18 Usefulness from the students’ perspective ............................................................ 20 Participation from the teachers’ perspective ......................................................... 21 Participation from the students’ perspective ......................................................... 24 Manageability from the teacher’s perspective ...................................................... 26 Manageability from the students’ perspective ...................................................... 26 CHAPTER 4: DISCUSSION ......................................................................................................28 CHAPTER 5: LIMITATIONS ...................................................................................................34 CHAPTER 6: CONCLUSION....................................................................................................35 APPENDICES ..............................................................................................................................36 Appendix A: Task Engagement Survey ........................................................................ 37 Appendix B: Early TV Task .......................................................................................... 38 Appendix C: Late TV Task ............................................................................................ 39 Appendix D: Music Task ................................................................................................ 40 iv BIBLIOGRAPHY ........................................................................................................................42 v LIST OF TABLES Table 1. Where I taught what, when ..............................................................................................14 Table 2. 
PCA correlation matrix ....................................................................................................16 Table 3. ANOVA F values.............................................................................................................17 vi LIST OF FIGURES Figure 1. Summary of my revisions and their effect on student engagement. ...............................27 vii CHAPTER 1: RESEARCHING TASK DESIGN THROUGH ACTION RESEARCH What are Tasks and Why Use Them? Language teaching is said to be task-based when teachers use tasks as the organizational unit of instruction instead of linguistic units such as words, structures, or functions. Many definitions have been put forth in the four decades since task-based syllabi first appeared. In practice, teachers often select and employ tasks based on their type. Classifications vary. Willis, (1996) provided a typology including listing, comparing, sorting, problem solving, experience sharing, and creative tasks; while Prabhu (1987), a pioneer of task syllabi, provided three major classifications: information-gap, opinion-gap, and reasoning-gap tasks. Consider how the following definition encompasses the abovementioned classroom activities. A task is “an activity that has a non-linguistic purpose or goal with a clear outcome and that uses any or all of the four language skills in its accomplishment by conveying meaning in a way that reflects real-world language use” (Shehade, 2005p. 18). For instance, an activity where students have to collectively decide on a movie to watch is a task, while an activity where students learn and practice the structures and vocabulary associated with deciding on a movie is not. Many educators have provided guidance for teachers adopting a task-based approach. Willis (1996) proposed a task-based learning framework with the following components: a pre- task phase, a task cycle, and a language focus stage. The pre-task phase includes an introduction to the topic and any necessary preparation for the task. The task cycle is an opportunity for learners to “use whatever language they can muster to achieve the goals of the task,” where the teacher must “stop teaching and just act as monitor” (Willis, 1996, p.1). Finally, the language focus stage includes analysis and practice of problematic student language from the task cycle 1 (Leaver & Willis, 2004). The task cycle is further divided into the task itself, which aims to increase communicative competency and fluency, and the planning and reporting stages, which aim to create a need for accurate language use. One defining characteristic of all task-based syllabi is that they are analytic as opposed to synthetic. Synthetic syllabi present isolated linguistic features or groups of linguistic features one at a time. The expectation is that students gradually acquire the language by re-synthesizing the knowledge that is imparted to them discrete bit by discrete bit. Analytic syllabi, by contrast, present the raw, undissected target language. The expectation is that SLA occurs via students’ ability to analyze the language they are exposed to and induce linguistic rules and patterns (Long and Crookes, 1992). Why use tasks? Supporters of tasks argue that synthetic syllabi are incompatible with learners’ developmental sequences. Presenting language as a sequence of discrete structures and functions which learners are expected to master before proceeding onto the next one ignores that language development is a complex and nonlinear process (Larsen-Freeman, 2011). 
Even lexical items, which—due to their relative discreteness—may appear good candidates for inclusion in a lexical synthetic syllabus, nevertheless come up short due to issues of selecting authentic texts when constrained by predetermined groups of words (Long and Crookes, 1992). Finally, task- based language teaching (tblt) is fun and promotes increased engagement and empowerment for both students and teachers (Leaver & Willis, 2004). How Have Tasks Proliferated? The evolution of tblt has two eras, pre- and post-Long and Crookes (1992), who established Task-Based Language Teaching (capitalized TBLT). While the TBLT syllabus is 2 distinct from its precursors, the The Procedural Syllabus (Prabhu, 1984) and the Process Syllabus (Breen, 1984), all three syllabi are analytic and task-based. Long and Crookes (1992) identified three problems present in both precursor syllabi: 1) there was no rationale for the content of the syllabus (aka no needs analysis), 2) the grading of task difficulty and sequencing of tasks was unprincipled, and 3) there was no focus on form to draw learners’ attention to their problematic language use and thus facilitate SLA. These three shortcomings of TBLT’s precursors featured prominently in Long and Crookes’ subsequent rationale for TBLT. Since its establishment, TBLT has been implemented, evaluated, and researched in settings ranging from provincial education departments (Van Gorp & Bogaert, 2006), to governmental agency programs (González-Lloret & Nielson, 2015), to university language departments (Byrnes, 2002) to university courses (Mcdonough & Chaikitmongkol, 2007; Yasuda, 2011). Tasks have also come to feature prominently in language assessments (Norris, 2016), as well as in empirical studies and cognitive models of SLA (e.g. Révész, 2014; Robinson, 2001). Teachers Engaging with Tasks: Tasks are Dynamic Tasks, far from being the fixed constants they are often treated as in empirical laboratory studies, are dynamic and prone to variability as they play out when teachers use them in classrooms. In a study examining four teachers of Dutch as a second language adopt tasks from language textbooks in their classrooms, Van Den Branden (2009) observed the following patterns emerge: (1) experienced teachers “show a strong inclination to rewrite the task scenario” (p. 268); and (2) students reinterpreted the task instructions to fit whatever goal they found worthwhile to pursue. The interaction that took place during the task, as well as student 3 perceptions of tasks and performance on tasks, were context-specific and based on the dynamic interplay of teacher, the task, and students. Other researchers also recognize the variable nature of tasks. Foster (2009) referred to tasks as “un-pre-focused” (p.249). TBLT offers opportunities for incidental, individualized learning, but these are spontaneously generated during the course of the task. It is due precisely to this variability that Van Den Branden (2016) claimed “the quality of TBLT cannot exceed the quality of the teachers working with tasks” (p.179). Teachers have received too little attention, considering the indispensable role they play in delivering task-based instruction. Chan (2012) found that teachers’ real-time responses to students’ needs were more important for shaping learning than the actual tasks themselves. The four teachers in her study employed pedagogical strategies to modify linguistic, cognitive, and interactional demands of tasks. 
The study highlighted the importance of “the teacher’s professional judgment as to how to manage the three dimensions of task demand at different stages of learning through constant adjustment of the task variables to make the task more or less demanding” (Chan, 2012, p.206). Often, when researchers examine teachers implementing TBLT, they portray the teachers as having failed to adhere to its principles. Two articles published in 2016 discuss teachers adopting TBLT and illustrate the contrast between two types of research into teachers using tasks, prescriptive and descriptive. The first article reports on a study conducted by Erlam, (2016) in the New Zealand state school context to investigate the success of a TBLT training program. After receiving instruction on Ellis' (2003) four criteria that constitute a task, less than half of the teachers fulfilled all four criteria when implementing tasks in their classes. Teachers 4 implementing TBLT are shown to be unsuccessful and this indicates a tension between TBLT in theory and TBLT in practice. Another article, a position paper and literature review by Van Den Branden(2016), highlights concerns with a prescriptive reading of “TBLT as a researched pedagogy” (p. 76), noting that teachers asked to adhere to prescribed TBLT guidelines will often experience a so- called lack of success doing so given that teachers might see the guidelines as unsound, and justifiably so, since there is as yet no unified theory of SLA nor a strong case for the applicability of laboratory studies to classroom settings. For this reason, Van Den Branden (2016) advocates for a different type of TBLT research: A more descriptive reading of a “researched pedagogy” would make the main focus of research be the pedagogical actions and decisions that are actually taken by teachers and students in authentic classrooms while they are working with tasks…. …Classroom-based research into tasks-in-action, then, could include virtually any actual use teachers and students make of tasks in authentic classrooms. (p. 177) A teacher that decides to use tasks in his/her classroom cannot be said to be a (strong) TBLT practitioner, because implementing TBLT is a program-wide undertaking that includes a needs analysis and materials development (Long, 2016). While these are beyond the scope of a teacher without program-wide backing, many teachers are nevertheless using tasks in their classrooms. Until these teachers’ institutions decide to embrace (strong) TBLT, teachers documenting their classroom experiences using tasks, although “subjective and impressionistic,” can motivate future, more systematic research (Long, 2016, p. 12). This research could be conducted when “moving forward together systematically in what must be a collaborative effort” (Long, 2016, p. 29) becomes a reality for TBLT (if it ever does). 5 Action Research and TBLT Action research is a type of classroom-based research that teachers can conduct to investigate issues in their classrooms, experiment with new teaching approaches, and empirically evaluate and reflect on the effectiveness of their teaching (Burns & Westmacott, 2018). The following action research studies lend support to Van Den Branden's (2016) claim that “implementation of TBLT is a gradual process of learning, which needs to go through repeated cycles of trying out, reflecting, revising, and trying out again” (p. 175). They also highlight some differences between the implementation of TBLT in ESL versus EFL contexts. 
The first two articles document single tasks in ESL contexts. O'Connell (2015), while not an action research study, nevertheless reported on a pre-instruction stage of TBLT: the needs analysis and preparation of instructional materials. He detailed the steps taken to create prototypical dialogues for a target task, preparing students for a police traffic stop in the U.S. He concluded by stressing the importance of needs analysis and of consulting domain experts to ensure useful instruction for learners. Calvert & Sheen (2015) reported on the process of designing, implementing, modifying, and evaluating a language learning task for an English for Occupational Purposes course in the U.S. They documented the specific changes to the task structure, input, and implementation that made Task 2 a success where Task 1 was not. They concluded that in addition to enhancing the instructor's command of the approach, conducting task-based action research led the instructor to view TBLT more positively.

The next two articles report on teachers in EFL contexts teaching entire task-based courses, not just single tasks. Harris (2014) used a task-based approach to teach her first-year Korean students, documenting the process. First conducting a needs analysis, and then determining, adapting, and revising the seven-week syllabus as she went, she perceived the need for more "pre-communicative" activities than most TBLT advocates would espouse. This action research revealed deviations from prescriptions of TBLT that the teacher nevertheless found beneficial. Huang (2016) implemented TBLT with Chinese university students in their core requirement English course. Contending with issues such as students' excessive mother tongue use, off-task behavior, and "bad performance on presentations" (p. 121), she ultimately concluded that a mix of task-based and more traditional teaching methods might be the best approach. Recognizing her own lack of experience and competence carrying out TBLT, she documented the specific measures she took to make implementation work for her.

Learners Engaging with Tasks

Just as teachers' beliefs and judgment can affect how TBLT plays out in classrooms, so can learners' levels of task engagement. TBLT involves cooperative and collaborative learning (Long, 2016), so one factor that contributes to the effectiveness of tasks is the learners' task engagement. Researchers agree that learner engagement is necessary for language development but have reached no consensus about what it means to be engaged or how to measure it.

Taking a firm, empirically based approach, Dornyei (2009) examined interlocutor dyads and took size of speech (measured by the number of words) and number of turns during communicative tasks as an index of students' task engagement. He justified this approach by claiming that "if students are not actively involved in instructional tasks and do not produce a certain amount of language output, L2 learning is unlikely to be effective in developing communicative competence" (Dornyei, 2009, p. 363). Developing communicative competence is only part of the picture. Engagement also depends on motivation and overall classroom dynamics. Two of the abovementioned action researchers adopted TBLT for reasons beyond increasing language proficiency.
Huang (2016) targeted students' study motivation and language proficiency, while Harris' (2014) sole aim in implementing task-based lessons was to improve classroom dynamics, which she found to be successful by triangulating video observations with student surveys.

In contrast to direct measurement of student engagement, two researchers designed surveys based on constructs of engagement that they either created themselves or adopted from other fields. In an instance of the latter approach, Egbert (2006) applied Flow Theory in the language classroom as a proxy for engagement. The construct of Flow involves a balance between challenge and skills; if the challenge-to-skill ratio is too high, anxiety results; if it is too low, boredom results. Learners' achieving a state of flow also corresponds to their interest, control, and sustained attention during a task. Egbert (2006) surveyed students about seven different tasks and measured the number of participants in Flow for each one, recommending that teachers develop tasks that facilitate the flow experience for students by manipulating the four components: challenge/skill level, interest, control, and attention. Philp & Duchesne (2016) offered an original model of the construct of engagement. For them, engagement reflects students' participation in four dimensions, all of which are interrelated and mediate each other: cognitive, social, behavioral, and affective dimensions. Task participants are said to experience engagement when they are in a state of heightened attention, involved simultaneously in all of these dimensions.

In light of the varying notions of engagement and the different factors theorized to promote it, and in light of the complex and varied nature of real teachers using real tasks, my research questions are broad:

1. How did a teacher revise a task to make it more engaging for learners?
2. How did the students respond to these revisions?

CHAPTER 2: METHODS

Participants

I collected data from 222 Chinese students (aged 18-23) in 9 classes at 2 universities studying English communication and culture in a summer intensive program. The class sizes ranged from 25 to 35 students. Students' English proficiency ranged from novice-high to advanced-high, as observed by the author. I am also a participant in my own study; I am reflecting on the decisions I made as a teacher regarding the construction and revision of a task I taught 20 times.

Materials

Task engagement survey. I administered a survey to measure students' task engagement. The survey contained 17 items on an 8-point Likert scale, a free-response section, and a brief background questionnaire including students' age and gender.

Task materials. Over the course of twenty lessons, I taught three tasks requiring students to prepare presentations. Below, I describe them. (For the task instructions students received and the materials and artifacts they used to complete the tasks, see Appendices B, C, and D.)

Task I – Early TV: After giving an overview of television in America in the previous lesson, I showed students eight short YouTube videos. Then, I grouped them and gave them 40 minutes to prepare a short presentation in which they reacted to the videos and compared the two media, YouTube and television.
Task II – Late TV: After giving an overview of television in America in the previous lesson that focused on stereotypes, I grouped students and gave them 40 minutes to prepare a short presentation in which they listed stereotypes they had encountered in their own lives and discussed the implications of these stereotypes.

Task III – Music: After giving an overview of protest music in America in the previous lesson, I grouped students and gave them songs to interpret and prepare presentations on. Students had 40 minutes to prepare their presentations. I provided the music videos and lyrics in group chats using WeChat (a popular Chinese chat app). I required students to interact with me within these group chats as they prepared their presentations.

Teacher's journal. I documented each iteration of my tasks by keeping audio and written notes before, during, and after each class.

Procedure

During a 2-hour class, I set students to work on a 40-minute task to prepare a presentation in groups. After the student presentations, I administered the task engagement survey. I repeated this process 9 times with 9 different classes to gather data on 9 iterations of the tasks. Altogether I taught 20 lessons while at the two universities. Owing to logistical difficulties, I failed to collect data from 11 of the 20 classes I taught.

Analysis

This is a mixed methods study. I used a qualitative approach to answer RQ1: How did a teacher revise a task to make it more engaging for learners? I reviewed my audio and written teacher's journal and my task materials for themes that emerged to guide my analysis. Although I only collected data from 9 classes, I taught 20 classes and will consider them all when answering RQ1.

I used both qualitative and quantitative approaches to answer RQ2: How did learners respond to these revisions? I examined students' free responses on the survey and analyzed common themes to speculate about their engagement during each task. For the quantitative element, I created survey items with three scales in mind (usefulness, manageability, and participation) and conducted a principal component analysis (PCA) on the survey items to uncover any other latent factors that would provide a more nuanced perspective of students' task engagement. Using the results of the PCA, I computed scales by summing the thematically corresponding items, and these scales aligned with my originally conceived factors. I then used univariate ANOVA to compare the 9 different classes' task engagement using these scales as dependent variables.

My data are not appropriate for a repeated measures design. My lessons were delivered to different groups of students over time with the same lesson plan (despite minor instructional changes), so no students were measured more than once, and my data are therefore structured differently from a repeated measures design. Instead, I viewed the different lessons as groups of equivalent students that received different instructional "treatments," for which I wanted to know whether class means for usefulness, manageability, or participation differ significantly from one another. This structure necessitates a univariate ANOVA design. My dependent variables are usefulness, manageability, and participation. I have four independent (predictor) variables. The first is "Group of students" (this is a proxy for time; aka lesson number).
The next three represent instructional changes I made: "Outline Required" (asking students to create an outline of their speech as they work in groups), "Outlines Public" (these outlines being made available in a group chat to the entire class immediately before each group presented), and "Feedback Warning" (students being told they would receive feedback on their speeches). Thus, I used univariate ANOVA to partially answer RQ2: How did students respond to task revisions?

The qualitative and quantitative elements of my study interplay. As I gathered data about students' reactions (RQ2), I examined it and used it to inform further revisions to my task (RQ1).

CHAPTER 3: RESULTS

What I did/what happened

Table 1 indicates where I taught what, when, and from which groups I collected data. Thus, Table 1 provides a frame of reference for presenting my results. I taught a total of 20 three-day classes at two universities, ten classes at each university. The task was always conducted on Day II, but the task content was introduced on Day I, and the task feedback was given on Day III.

Table 1. Where I taught what, when

Early TV task (Television course): Compare YouTube and television and discuss pros and cons of YouTube in China
Late TV task (Television course): Discuss stereotypes
Music task (Music course): Analyze and report on a song's meaning

Note: I taught classes 1-10 at University 1 and classes 11-20 at University 2; I collected survey data from classes 3, 4, 5, 6, 7, 8, 11, 13, and 14.

I set out to trace the evolution of a task by teaching it multiple times. I expected slight adaptations. In reality, two drastic mutations occurred. I have, therefore, classified the task's 20 iterations into three categories: Early TV, Late TV, and Music. Despite their differences, the Early TV, Late TV, and Music tasks together constitute the record of my implementing TBLT over a definite period of time. The ideas that were at play during the creation and modification of these three distinct tasks are all interconnected. Therefore, when discussing the results, I will view all three (Early TV, Late TV, and Music) as iterations of the same task. "Class" refers to the chronological numbering of the 9 groups of students I collected data from, whereas "iteration" refers to the chronological numbering of the 20 times I taught the task.

Quantitative Results

Principal Component Analysis. I conducted a principal component analysis (PCA) on 11 of the 17 task engagement survey items with orthogonal rotation (varimax). The Kaiser-Meyer-Olkin measure verified the sampling adequacy for the analysis, KMO = .880 ('great' according to Field, 2009), and all KMO values for individual items were > .84, which is well above the acceptable limit of .5 (Field, 2009). Bartlett's test of sphericity, χ²(55) = 1055.083, p < .001, indicated that correlations between items were sufficiently large for PCA. I ran an initial analysis to obtain eigenvalues for each component in the data. Two components had eigenvalues over Kaiser's criterion of 1, one had an eigenvalue over 0.9, and two more had eigenvalues over 0.7. These five components explained 79.3% of the variance. I merged the five components into three for the final analysis based on my own interpretation of the relationship between items, given instances of multiple loadings.
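To make the analysis pipeline described in the Methods chapter concrete, the following is a minimal sketch of how the survey data could be processed in Python: a varimax-rotated principal component analysis of the Likert items, scale scores computed by summing the items assigned to each component, and a one-way (univariate) ANOVA comparing the surveyed classes on each scale. The file name, column labels, and the choice of the factor_analyzer and SciPy packages are illustrative assumptions of mine, not a record of how the original analysis was carried out.

```python
# Minimal sketch of the survey analysis pipeline (assumed file and column names).
import pandas as pd
from scipy import stats
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

# Hypothetical data layout: one row per student, columns item1..item11 holding the
# 8-point Likert responses, plus a "class_id" column identifying the 9 surveyed classes.
df = pd.read_csv("task_engagement_survey.csv")
items = [f"item{i}" for i in range(1, 12)]

# Sampling adequacy and sphericity checks of the kind reported in the Results chapter.
kmo_per_item, kmo_total = calculate_kmo(df[items])
chi2, p = calculate_bartlett_sphericity(df[items])
print(f"KMO = {kmo_total:.3f}, Bartlett chi2 = {chi2:.2f}, p = {p:.4f}")

# Principal components with orthogonal (varimax) rotation.
pca = FactorAnalyzer(n_factors=5, rotation="varimax", method="principal")
pca.fit(df[items])
loadings = pd.DataFrame(pca.loadings_, index=items)
print(loadings.round(2))

# Scales formed by summing the items assigned to each component
# (assignment shown in Table 2), then a one-way ANOVA per scale:
# do class means differ on usefulness, manageability, or participation?
scales = {
    "usefulness": ["item1", "item2", "item3"],
    "manageability": ["item4", "item5", "item6", "item7", "item8"],
    "participation": ["item9", "item10", "item11"],
}
for name, cols in scales.items():
    df[name] = df[cols].sum(axis=1)
    groups = [g[name].values for _, g in df.groupby("class_id")]
    f_val, p_val = stats.f_oneway(*groups)
    print(f"{name}: F = {f_val:.2f}, p = {p_val:.3f}")
```

In practice, the item-to-scale assignment would be read off the rotated loadings rather than hard-coded; hard-coding it here simply keeps the correspondence with Table 2 explicit.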
The items that cluster on the same components suggest that component 1 represents a perception of task usefulness, components 5 and 3 the manageability of task instructions, and components 2 and 4 student participation on the task. Below, I present the rotated solution from the PCA: the items assigned to each component, the variance each component explained, and the reliabilities of the final scales.

Table 2. PCA correlation matrix

Usefulness (component 1; eigenvalue = 5.18, 47.07% of variance; α = .83):
- This task was a good learning experience.
- I could use what I learned from this task while communicating with native English speakers.
- I learned something about American culture performing this task.

Manageability (components 5 and 3; eigenvalues = 0.71 and 0.98, 6.50% and 8.89% of variance; α = .80):
- I understood what was expected of my group upon completion of this task.
- I had the skills to complete this task.
- I understood what was expected of me during groupwork on the task.
- My group had ample time to complete the task.
- My group completed everything in the task.

Participation (components 4 and 2; eigenvalues = 0.78 and 1.09, 7.09% and 9.90% of variance; α = .76):
- I participated more than other group members during this task.
- My contribution to the group was meaningful.
- I was able to fully participate in this task.

Note: Items are grouped under the component(s) to which they were assigned in the analysis. The reliabilities (α) reported are those of the final scales.

ANOVA. While the results of my survey showed no significant effect of any task revision on the measures Usefulness or Participation, there was a significant effect of the revision "Outline required" on the measure Manageability, F(1,1) = 11.31, p < .05, R Squared = 0.45. This means that when I introduced the requirement that, in addition to the speeches themselves, students had to create an outline of their speeches, students found the task less manageable. Table 3 presents the F values for all the ANOVAs I ran.

Table 3. ANOVA F values

Group of students (Classes 1-9):
  Usefulness: F(8,8) = 0.59, p > .05, R Squared = 0.02
  Manageability: F(8,8) = 3, p < .05, R Squared = 0.09
  Participation: F(8,8) = 1.41, p > .05, R Squared = 0.47
Outline required (Classes 5-9):
  Usefulness: F(1,1) = 0.4, p > .05, R Squared = 0.02
  Manageability: F(1,1) = 11.31, p < .05, R Squared = 0.45
  Participation: F(1,1) = 0.33, p > .05, R Squared = 0.00
Feedback warning (Classes 5-9):
  Usefulness: F(1,1) = 0.4, p > .05, R Squared = 0.02
  Manageability: F(1,1) = 11.31, p < .05, R Squared = 0.45
  Participation: F(1,1) = 0.33, p > .05, R Squared = 0.00
Outline public (Classes 8-9):
  Usefulness: F(1,1) = 0.19, p > .05, R Squared = 0.01
  Manageability: F(1,1) = 0.3, p > .05, R Squared = 0.00
  Participation: F(1,1) = 1.89, p > .05, R Squared = 0.01

Qualitative Results

In this section, I address both RQ1, examining the teacher's decision-making, and RQ2, examining the students' responses. My post-hoc examination of the students' open-ended survey responses revealed a sizeable gap between teacher and student perceptions of the different task iterations. Students and the teacher (that is, me) were on different wavelengths regarding the tasks' strengths and weaknesses. Therefore, the three themes Usefulness, Participation, and Manageability structure this section, with each theme further divided into two parts: teacher and student perspectives. (Note that I did not collect student data for the Music task, iterations 15-20.)
Usefulness from the teacher's perspective

Starting with the sixth task iteration, I made the following revision: I began taking close notes while students gave their presentations on Day II, noting the problematic language I heard. Day III would then always begin with a Focus on Form session. Having categorized the language from the students' speeches into errors of either intelligibility or word choice, I went through the errors item by item during these sessions. Incorporating Focus on Form helped improve my class in many ways, so I retained this revision for future task iterations. The addition of feedback after the task provided two main benefits: increased usefulness and increased student accountability.

Increased usefulness

Receiving feedback on their speeches made the instruction more useful for students. At its core, the addition of feedback rendered the task a garden path activity for students. The purpose of the task became to elicit large speech samples that I would comb through for errors that (1) I perceived as systematic among the students and (2) I believed would hinder communication with native speakers. The feedback sessions then became a chance for me to illuminate the error types and provide students with meta-linguistic strategies for avoiding them.

The errors I perceived as systematic were inappropriate register and mispronunciation, which often co-occurred. The reason was clear: students were looking up words on their phones. They were searching for English translations of Chinese words in formal register, likely chosen because Chinese classrooms tend to exhibit more formal language than American classrooms. The students' phone-dictionary searches returned mostly unfamiliar English words, and, therefore, these words were often mispronounced on top of being inappropriate in register, a co-occurrence that stood out as a tangible area for improvement I could target.

Increased accountability

Both the close listening and note-taking on Day II and the subsequent feedback sessions on Day III made students more accountable for contributing positively to the class. Before I started note-taking, students had been quietly mumbling through presentations as if unconvinced anyone was really listening. My close listening and note-taking signaled to students that there was a teacher trying to attend to their individual needs. It also made it apparent that there was a true listener for their speech. With more of a real reason to communicate, students tried harder to do so, speaking more loudly and clearly. Other teachers on the program and I often commented on how difficult it was "getting students to talk."

The feedback sessions increased classroom accountability because I now had a record of who had said what. I could ask students (otherwise too shy to volunteer) what they had meant by difficult-to-interpret, non-target-like language from their speeches. I was trying to create a classroom where students viewed errors as an opportunity to learn instead of as a failure on their parts. Even when I could interpret students' problematic language myself, I would sometimes ask about it regardless in order to initiate Socratic-style dialogues that other students could learn from. In addition to socializing EFL students into Western-style classroom participation, the feedback sessions provided students with opportunities to negotiate for meaning and learn from errors that they had made themselves.
The sessions also provided a segue into teaching circumlocution, a meta-linguistic strategy I deemed essential in light of students' over-reliance on phone dictionaries. These benefits were only possible because the record of who said what raised student accountability and encouraged them to own up to their language.

Usefulness from the students' perspective

As stated above, my main revision to increase Usefulness was the incorporation of feedback. Unfortunately, because the feedback sessions occurred the lesson after I gave the survey, I was unable to measure students' reactions to my feedback sessions. However, the student quote below echoes my view that the close listening and note-taking were beneficial.

"When we're discussing, you try to record or words. That's really make me moved. Thank you." (Student, Early TV task)

What I did find was that students valued learning about America, including the YouTube videos I showed them for the Early TV task. Students expressed that they enjoyed watching the YouTube videos and were able to learn about America by doing so. In the section below, I discuss a revision that I thought increased the cross-cultural component of the task: instead of having students view YouTube videos, I had them draw on the TV shows from Day I and the stereotypes present in them for their speeches. Some students commented that the Late TV task was boring, which to me indicated that students found the task more useful when they were learning about America than when they were sharing about China.

Participation from the teacher's perspective

I tried to improve student levels of participation by increasing interaction in three ways: between teacher and students during the task phase; between students presenting and their audience during the reporting phase; and among student members of groups during the task phase.

Increased interaction between teacher and task participants

As someone who is familiar with many EFL students' hesitation to speak up in class, I felt certain that my students did not always understand everything in the instructions and nevertheless neglected to seek clarification. To provide a low-stakes way to ask the teacher questions, I set up a table in the corner of the room and called it a "help desk." During the early iterations, I devoted substantial time to verbally relaying the task instructions with accompanying slides; providing the task instruction slides to students within a WeChat group instead resulted in more teacher/task-participant interaction in the form of clarification questions. This was perhaps because students felt more comfortable asking questions when they could reference specific language they were having trouble interpreting. However, the help desk's primary purpose was to encourage students to utilize my expertise during the task phase, because I considered myself students' number one source of linguistic and cultural knowledge for use in their speeches.

After 15 iterations with less student-teacher interaction during the task phase than I desired, despite the presence of my help desk, I began to see more interaction beginning with the Music task. I attribute this to two factors: the input that was the basis for the task became more difficult, and I changed the channel of communication between task participants and teacher to a digital format.

Here is a description of the Music task: On Day I, we discussed protest music.
On Day II, I gave groups of students protest songs to interpret and prepare presentations on. I provided the music videos and lyrics in WeChat groups. The historical context and the lyrical content of the songs was difficult for students. While they may have had the frame of protest music to guide their presentations, nevertheless I knew they would need explanation from me to arrive at a reasonably accurate interpretation of the songs. I therefore stipulated that students must ask me at least three questions related to the historical context or lyrical content of their songs in the WeChat group. Using the digital format, I was able to provide links to online readings, pictures, text, or voice messages in response to their questions, and I was easily able to manage this for all five groups simultaneously. I would watch as students in a group all listened to my voice messages multiple times, consulting to interpret their meanings. During the music task, I was engaged with students in this way during almost the entire task phase, which was a dramatic increase from prior lessons. Increased interaction between presenting students and their audience I made two revisions to increase the audience’s incentive to listen attentively, because I thought having the audience listen attentively to groups’ presentations would give presenters a real reason to make their language as comprehensible as possible and would give the audience an opportunity to practice listening. Building on the revision requiring outlines (discussed below), I made the groups’ outlines of their speeches public, making them available to the whole class by posting them in the WeChat group. Students could view the outline, and thereby get an idea of the content of their peers’ speeches before listening them. This was meant to pique the students’ interests in the 22 speeches and make them easier to follow. My hope was that this, the first of two revisions, would result in a more attentive audience. In the same task iteration when I made the outlines public, I also overhauled the Early TV task. In the Early TV task, students make a comparison between American television and YouTube videos, including arguing their position for or against censorship on YouTube. Student responses to the five questions in my Early TV task prompt tended to be relatively homogenous. For example, the Early TV task prompt asked “who might be interested in these [YouTube] videos and why do you think they are so popular?” The Late TV task prompt, by contrast, was not only more open-ended, but also more thought-provoking. For example, the Late TV task prompt asked, “how are stereotypes harmful or useful and what are appropriate and inappropriate ways of stereotyping?” (for more details, see appendix B). This overhaul of the Early TV task, which allowed for a wider range of more intriguing student responses, was meant to increase interest among the audience. This included not only the students, but also me. Thanks to the revision, I was now a part of the audience in a way I had not been before. I had spent Day I introducing students to some common American stereotypes present in television shows, and now I was asking them to reciprocate by introducing me to stereotypes they were familiar with in their own lives. I was no longer only listening for linguistic features of students’ speeches; I was learning about China. 
This cultural exchange component of the Late TV task made it more meaningful for me as a teacher, which for the students almost took it out of the realm of pedagogical task and into the realm of actual task: tell me about China.

Increased participation among student group members

When I made the revision to include the outline requirement, I wanted students to understand why. Thus, in addition to defining an outline in my task instructions, I also included my rationale for requiring one. Below I give excerpts from the task instructions.

"Why Provide an Outline?
• It helps avoid redundancy. You and your group must cooperate to plan a coherent and cohesive presentation. Your classmates will be able to see your outline as you present.
• If you decide to speak in English with your group while planning, planning a single coherent presentation will increase communicative burden, which will increase interaction, which helps your language development." (Task Instructions, Early TV task)

As can be seen above, it was my thought that the outline requirement would increase interaction between students during the task phase.

Participation from the students' perspective

The student quote below demonstrates a fundamental problem with my task: a lack of interaction during the task phase.

"If we are giving five questions to five people and ask them to figure them out in a short time, people will focus on the question they are given and show little interest in other questions, that is a problem." (Student, Early TV task)

The student quoted above expressed in more precise terms what many other students suggested when they wrote that they desired more "interaction" and a more "active" task. These comments persisted through the Early and Late TV tasks. The reason is that in both tasks, students spent 40 minutes preparing their presentations (see Appendix B for more details). The burden on students to remain motivated for such a long span of time was probably too great. That burden may have been eased with more structured opportunities for spoken interaction. I sought to promote interaction via the outline requirement, but students appeared to want something else; the quote below is one of many student quotes expressing a desire for more "games."

"You may need some games to make students together." (Student, Late TV task)

I interpret the desire for more games as a desire for a more structured way for students to interact, because games in the language classroom usually refer to activities that motivate students to communicate by offering a fun challenge. Although I did not collect data from students for the Music task, I anticipate students would not have expressed the same desire for more interaction, since I provided opportunities for interaction during the task phase of the Music task (opportunities that were not present in the Early and Late TV tasks). One source of interaction was with the teacher directly via the digital help desk. The other was not interaction per se, but rather input. In the Early and Late TV tasks, students spent 40 minutes producing output about input they had seen previously. In the Music task, students were simultaneously tasked with processing input (music and lyrics) and producing output. This dual requirement served as a practical substitute for interaction.

Manageability from the teacher's perspective

Students seemed to have no problems completing the task, so I did not aim to increase overall manageability with any of my task revisions.
Instead, I was intent on achieving the effect illustrated by the following student quote:

"I like the assignment that you give us, but it's too less time. The question may be too much, or the time to prepare was a little short, we don't prepare it at the best, and a little nervous when we start talking." (Student, Late TV task)

Recall that I mentioned my concern was that students "had been quietly mumbling through presentations as if unconvinced anyone was really listening." My thinking was that if students had either less planning time or a more demanding task, they would have to improvise somewhat during their speeches, which would preclude "mumbling through." Recall also that I observed students using words that were low-frequency and inappropriately formal in register. It seemed to me that if students were improvising parts of their speeches, they would be forced to use circumlocution, producing language that was less formal and more level-appropriate. I therefore decided to revise the task to make it more demanding (i.e., less manageable) when I overhauled the Early TV task and created the Late TV task.

Manageability from the students' perspective

The most prevalent student reaction to manageability involved the time allotted for completion of the task. Students' free-response comments indicated that too much time was allotted for completion of the Early TV task while too little time was allotted for the Late TV task. Students also disliked the outline requirement, but only in the Late TV task, where this complaint co-occurred with claims that there was too little time.

Figure 1. Summary of my revisions and their effect on student engagement

CHAPTER 4: DISCUSSION

I designed this project to investigate more closely how teachers design tasks for their language classrooms. Task design is an iterative and highly involved process (O'Connell, 2015), and I wanted to take an in-depth look at the task design, implementation, and revision process from the teacher's perspective, a kind of research I believe is lacking in the task-based literature. In this study, I conducted task-based action research, or, in other words, I conducted teacher research in the classroom. As described by Burns (2013), who was summarizing Baumann and Duffy-Hester, I took an emic, or insider, perspective on task development. Action researchers "have unique, situation-specific participant roles in inquiry;" they blur "theory and practice as praxis," mixing "reflection and practice within the research process," "whereby teaching is subject to inquiry, exploration, and evolution" (Burns, 2013, p. 1). In this research environment, I was able to concretely look at task design, and I found that tasks used by teachers are constantly in flux, changing based on students' input and reflections, and also based on the teacher's input and reflections.

Van Den Branden (2009) noted that teachers revise their tasks, and I did this in rapid succession in connection with one task that I taught 20 times over a 6-week period during the summer of 2017, a unique teaching environment I was afforded in China. The environment allowed me to take a close look at the task-revision process because of the compressed task recycling I had the opportunity to engage in at the universities in China where I was working. As Van Den Branden (2016) observed, implementation of TBLT is a gradual and cyclical process.
Becoming a competent practitioner of TBLT is a journey, and this action research allowed me to 28 take my time on that journey; by looking closely at the revision of a single task, I was able to make an extended stop where many other teachers have to press on. Conducting this type of “classroom-based research into tasks-in-action” (Van Den Branden, 2016, p. 177) can hopefully begin to address the question of whether TBLT makes a difference for teachers and learners in real life. Since measuring the exact impact of TBLT is a “methodological nightmare” (Van Den Branden, 2016, p.176), the present study, by contrast, is a real-world attempt to begin uncovering the effects of TBLT in authentic classroom environments. Van Den Branden (2016) suggested areas to investigate via classroom-based TBLT research, and this study touches on some of them, namely teachers’ beliefs about possible ways to raise the learning potential of tasks, how and why teachers design task-based lessons, and how students react to them. Investigating student reactions, specifically their engagement, was a useful framework for reflecting on and revising my tasks. Much like how one teacher in Egbert’s (2006) study claimed that investigating flow in the language classroom sparked new teaching methods, measuring my students’ reactions after each task iteration provided fodder for my subsequent task revisions. Not only did I have more ideas to draw on, but with the added dimension of learner perspectives in particular, the nature of my task revisions changed. Chances to incorporate learner perspectives into task revision is invaluable, since language learning is unlikely to occur at all if the learners are not engaged. For example, while my students perceived 40 minutes of group work as too long, I did not see it that way until after measuring their reactions to my task. After the teaching in China was over, I came back to school to write up this action research report, and I consulted task-based literature in the process. I found myself comparing the lessons I had taught with the prescriptions of task-based language educators. Leaver & Willis' 29 (2004) TBL Framework offered solutions to problems I noticed retrospectively. Their task cycle is divided into three parts: task, planning, and report. By asking students to work together to prepare a presentation about videos from the pre-task phase, I effectively bypassed the task and proceeded straight to the planning phase. Drawing clear boundaries between the task and the planning could have decreased students’ use of their mother tongue (thereby increasing interaction in English), if I had asked them to use English to verbally reach a consensus on the content of their speech (in English) before beginning to write it. Although I explicitly encouraged students to use English to prepare their speeches, I also allowed them to use any mix of Chinese and English necessary to complete the task. Expecting them to use English for the task I assigned was, I realize in retrospect, unrealistic. First, because preparing a speech involves mostly writing. Second, like (Huang, 2016), I observed and embraced that students “felt awkward to speak English with those sharing the same first language” (p.121). I eventually got wise enough to address this by offering myself as a source of digital interaction during the students’ planning stage in the Music task, but should have offered students a structured way to use English with each other as well. 
If I were to repeat this research, I would want to survey students about the extent to which my task promoted English use in our EFL setting, given the integral importance of target language use to TBLT, and given the challenge of achieving lots of L2 use with groups of students sharing the same L1.

Straying from Leaver & Willis' (2004) TBL Framework resulted in students' excessive use of Chinese. By the same token, it ultimately created the conditions for my digital help desk innovation. Another area where I followed the framework with less than rigid fidelity, but that I thought was ultimately effective, was the language focus portion. Much like Huang (2016) during her journey implementing TBLT, I too perceived the need to go over metalinguistic strategies and provide feedback on student language. But because, unlike her, I taught the same task over and over, I noticed clear trends in students' problematic language use. I made the resulting diagnosis that the aspects of student language most in need of focus on form (Loewen, 2015) were vocabulary and pronunciation. Students responded positively to my first feedback session, so I structured subsequent ones around vocabulary and pronunciation too.

With each new group of students, the type of feedback I gave became more and more solidified as I became better at explaining helpful metalinguistic strategies and concepts. However beneficial, my feedback also became less and less responsive. Instead of targeting the specific language from each group of students' task performance, my language focus sessions gradually came to approximate entire mini-lessons of their own on register (see Loewen, 2015, pp. 98-99) and pronunciation. I was teaching the same things regardless of the language individual classes of students produced. One interpretation is that what started as an adherence to Long's (2016) methodological principle (MP) #7 of TBLT (provide negative evidence) spun out of control: it began as my learner-driven reaction to persistent student errors in the form of an explicit negative feedback pedagogic procedure (Long, 2016) but eventually resulted in my teacher-driven inclusion of a metalinguistic item in my task-based lesson (aka verging on task-supported language teaching).

I must confess there were teacher-centered concerns that triggered this decision as well. For one, I structured my register lesson around examples from my additional languages (Thai and Chinese) because I felt that emphasizing my foreign language learning background strengthened my bond with my students and made teaching more enjoyable. I also enjoyed presenting metalinguistic concepts that I believed most students were unfamiliar with and that I felt had the power to transform their beliefs about language learning. It felt good to blow students' minds with the knowledge I had acquired over years as a language learner and teacher. Despite the accepted contrast between teacher-centered and learner-centered teaching, I do not view the two as entirely mutually exclusive in this instance. Classroom dynamics benefit when the teacher is excited about what they are teaching. Teachers implement TBLT based on what they see as "practicable, feasible and appropriate" (Van Den Branden, 2016). Inserting my expertise and interests into the lesson felt like an appropriate and practicable way to increase my emotional investment and the students' interest and motivation, and thus foster positive classroom dynamics.
If there was a tension for me between “TBLT as a principled approach and TBLT as it takes shape in authentic educational practice” (Van Den Branden, 2016, p. 172), it was that establishing classroom dynamics and classroom norms was paramount. Once I had arrived at a pedagogic procedure that could boost my emotions as a teacher, the students’ level of interest, and the students’ level of accountability, I resisted changing it.

Long (2016) noted, “TBLT is still a relatively recent innovation—one whose adoption requires a considerable investment of time and effort if it is to be successful,” and added that “success will be more likely… where language learning is a serious matter and recognized and supported as such” (p. 28). The two task-based articles that report on ESL contexts both target very specific tasks, carrying out hotel duties and handling a police traffic stop, respectively (Calvert & Sheen, 2015; O’Connell, 2015). By contrast, the two task-based articles from Asia and this study report on courses with broader and less clearly defined objectives, as these courses are mandatory components of university programs (Harris, 2014; Huang, 2016). Whether officially or unofficially, teachers’ objectives ranged from IELTS preparation (Huang, 2016) to providing intensive English communication and American culture instruction (Li & Eckhart, 2018). Language learning is no less recognized and supported as a serious matter in these EFL contexts than in the ESL contexts mentioned above; however, adherence to all of Long’s (2016) MPs may be impractical or impossible where full-blown TBLT is not supported. In the meantime, teachers in these contexts can still try their hand at implementing TBLT in their own way, following the MPs to the greatest extent possible. In contexts where target tasks are clearly identifiable via needs analysis, O’Connell (2015) was right when he claimed “it is unlikely that there is anything more stimulating for serious language learners than work with real-life tasks that have immediate and clear relevance to them” (p. 130). In instances where needs analyses and materials development are either too complex or beyond the realm of what a lone teacher can manage, it is only reasonable to expect these teachers to use tasks as they personally see fit, that is, as one tool in an eclectic arsenal and not as the sole means of teaching language.

CHAPTER 5: LIMITATIONS

One limitation of this study is the fact that my instructional interventions cannot be causally linked to students’ task responses. This is because each new iteration of the task was given to a new group of students; there was no control group. Another limitation is that although I administered the task and survey nine times, upon returning from the teaching assignment and reviewing the evolution of my task, I was only able to identify three quantifiable alterations to the task that I could analyze. More systematic revisions would have yielded better indicators against which to analyze student engagement. One approach would have been to modify the task according to psycholinguistic classification parameters, i.e., “two-way,” “convergent,” and “closed-outcome,” as in the action research by Calvert and Sheen (2015). These parameters represent the collective wisdom of task-based researchers. Using them to revise my task could have helped me generate ideas that were not only time-tested but also more amenable to quantitative analysis.
One caveat to this is that the types of changes that teachers see fit to make in their real-life teaching contexts are often neither neat nor one-size-fits-all (i.e., pre-determined parameters may not apply), and this is one of the tensions between teacher and SLA researcher. The items on my survey may also not have captured engagement well. For instance, the manageability scale was oriented in such a way that more manageability was categorically considered good. Of course, in reality, a task can be too manageable, which can mean bored and disengaged students. Given that my survey did not take this into account, I believe flow (Egbert, 2006), which examines the balance between challenge and skills, would have been a more appropriate measure of engagement.

CHAPTER 6: CONCLUSION

This action research report documents my journey in adopting tasks to optimize learning for my students, and it documents how those students responded. The most important take-away from my study is that task design is iterative, and changes to tasks are made with input from both the teacher and the students. Teachers must adjust tasks to fit the specific needs of the students they are teaching. For example, in this project, in which I was teaching Chinese learners of English, I needed to carefully consider how to create interaction opportunities during the task and planning phases. I also needed to keep students motivated while they prepared their presentations. I did so by adjusting the prompt to allow for more heterogeneity in students’ answers, thereby increasing the audience’s interest. Finally, I needed to promote classroom accountability, and I did so by taking notes about who said what during presentations and including these in feedback sessions. Personally, as an action researcher, I became more confident implementing TBLT as I experimented. In the process, I discovered the unforeseen benefits of using tasks. One benefit was improved classroom dynamics. My task-based classroom was a place where learners received meaningful and illuminating feedback on their language use and where I felt emotionally invested in the process of organizing and motivating my students. Another benefit was that working within a task-based framework spurred my pedagogical innovation by constraining and channeling my approach to meeting the needs I perceived. Overall, I found tasks to be an empowering teaching tool that inspired me to be more thoughtful and challenged me to be more effective in promoting my students’ language learning.

APPENDICES

Appendix A: Task Engagement Survey

English Class: ______ Name (Optional): _______ Date: _______

I would like to ask you to help me by answering the following questions concerning the task you just completed. This is not a test, so there are no “right” or “wrong” answers and you don’t even have to write your name on it. Please give your answers sincerely, as only this will guarantee the success of the investigation. Thank you very much for your help!

I. Please check boxes to answer the questions.

CHECK THE BOX THAT INDICATES HOW TRUE EACH SENTENCE IS. (Response options: Very true/Always/100% Yes; Mostly true; Somewhat true; Not true/Not at all/No)

I spoke in English during the group work.
I spoke in Chinese during the group work.
This task was a good learning experience.
I could use what I learned from this task while communicating with native English speakers.
I learned something about American culture performing this task.
I understood what was expected of my group upon completion of this task.
I had the skills to complete this task.
I understood what was expected of me during group work on the task.
My group had ample time to complete the task.
My group completed everything in the task.
I participated more than other group members during this task.
My contribution to the group was meaningful.
I was able to fully participate in this task.
Using a dictionary or translator app during the task was useful.
Having paper and a pen or pencil to take notes was useful.

HOW OFTEN DID YOU USE THESE TOOLS DURING THE TASK? (Response options: Didn’t use it at all; Used it very often)

Dictionary or translator app
Paper and pen/pencil to take notes

II. Finally, would you please answer a few personal questions – I will need this information to be able to interpret your answers properly. Please provide:

Your gender: □ Male □ Female
Your age (in years):

COMMENTS: Do you have comments for me? If so, write any comments you want, in English or in Chinese, below. What did you think of the task? What did you like? What did you not like? Do you have suggestions for me on how to design this task better?

THANK YOU VERY MUCH!

Appendix B: Early TV Task

Early TV task

In the pre-task phase, students watched 8 YouTube videos:
1. Am I Ugly - PLEASE Be Honest: https://www.youtube.com/watch?v=AulY5Xoxp6Y
2. Am I pretty or ugley: https://www.youtube.com/watch?v=tLSFmYX45nc&list=PL0kmNXOFoXUvVtChVgRO1OlEMdckaYe8K
3. Out of Order - Cyanide & Happiness Shorts: https://www.youtube.com/watch?v=D49vvl7BPro
4. Car Dent Repair With Hot Water And Toilet Plunger DIY: https://www.youtube.com/watch?v=TybNUHyDAI4
5. EVERCLEAR BOTTLE SLAMMED IN 13 SECONDS !!: https://www.youtube.com/watch?v=fWdOUaQq_Y4
6. How To Fix A Stuck Zipper - Tips for Life: https://www.youtube.com/watch?v=lEOR3w2q0Yw
7. Super Mario Brothers Parkour In Real Life: https://www.youtube.com/watch?v=lEOR3w2q0Yw
8. THE FLOOR IS LAVA!!! (Original Video): https://www.youtube.com/watch?v=k7XMuKwgb7A

GET INTO GROUPS AND ANSWER THE FOLLOWING QUESTIONS. AFTER YOU HAVE FINISHED, YOU WILL PRESENT TO THE CLASS.
1. How does what you’ve seen so far compare to the TV shows we watched?
2. Who might be interested in these videos? What is the audience?
3. Why do you think these videos are so popular?
4. What are some advantages and disadvantages of this medium?
5. Do you think there should be more censorship on YouTube?

• Four-minute time limit. You will have 40 minutes to plan, rehearse, and perfect your presentation.
• Everyone must speak.

Appendix C: Late TV Task

Work with your group to prepare a response to my presentation prompt. After you have finished preparing, you will present your ideas to the class.

A.)
• List some stereotypes you are familiar with. Do not use the examples from yesterday.
• Are stereotypes useful? If so, provide example(s).
• Are stereotypes harmful? If so, provide example(s).
• How can stereotypes result in discrimination? Provide example(s).

B.)
• What is an appropriate way to stereotype?
• What is an inappropriate way to stereotype?
• Synthesize your answers into a single moral law.

Restrictions and requirements
• Four-minute maximum time limit. You will have 40 minutes to plan and rehearse your presentation.
• Everyone must speak.
• You may read from notes while you present.
• Provide an outline before 20 minutes have passed.

Appendix D: Music Task

I will assign each group one song, and the group will give a presentation about it to the class.

Creedence Clearwater: https://www.youtube.com/watch?v=qV4Q-RSQCq0
Bruce Springsteen: https://www.youtube.com/watch?v=EPhWR4d3FJQ
Rage Against The Machine: https://www.youtube.com/watch?v=8PaoLy7PHwk
Green Day: https://www.youtube.com/watch?v=Ee_uujKuJMI

Restrictions and requirements
• You have 40 minutes to prepare. There is no “everyone must speak” requirement, nor a time limit for presentations.
• While presenting, you must also play the video for the class.
• You must cite textual evidence. Your cited textual evidence should be written on the board or typed into a Word document for the class to view.
• You may research the song online, but you must cite your sources during your presentation.
• Every group must come up with three questions they have about the song’s lyrics or historical context and send them to me in the group chat (before 20 minutes have passed).

Questions to address (in no particular order)
• If this is a protest song, what is it protesting?
• What was happening at the time this song was written, and can that help explain the song’s message?
• What kind of social change do you think the song is trying to produce?
• What do you think of the song? Do you like it? Why or why not?
• What’s one interesting thing you learned by exploring this song?

BIBLIOGRAPHY

Breen, M. P. (1984). Process syllabuses for the language classroom. In C. J. Brumfit (Ed.), General English syllabus design (pp. 47–60). London: Pergamon Press & The British Council.

Burns, A. (2013). Qualitative teacher research. In C. A. Chapelle (Ed.), The encyclopedia of applied linguistics. Blackwell Publishing Ltd. https://doi.org/10.1002/9781405198431.wbeal0986

Burns, A., & Westmacott, A. (2018). Teacher to researcher: Reflections on a new action research program for university EFL teachers. Profile: Issues in Teachers’ Professional Development, 20(1), 15–23. https://doi.org/10.15446/profile.v20n1.66236

Byrnes, H. (2002). The role of task and task-based assessment in a content-oriented collegiate FL curriculum. Language Testing, 19, 425–433.

Calvert, M., & Sheen, Y. (2015). Task-based language learning and teaching: An action-research study. Language Teaching Research, 19(2), 226–244. https://doi.org/10.1177/1362168814547037

Chan, S. P. (2012). Qualitative differences in novice teachers’ enactment of task-based language teaching in Hong Kong primary classrooms. In A. Shehadeh & C. Coombe (Eds.), Task-based language teaching in foreign contexts: Research and implementation (pp. 187–213). Amsterdam: John Benjamins.

Dörnyei, Z. (2009). The motivational basis of language learning tasks. In K. Van Den Branden, M. Bygate, & J. Norris (Eds.), Task-based language teaching: A reader (pp. 357–378). Amsterdam: John Benjamins.

Egbert, J. (2006). A study of flow theory in the foreign language classroom. Canadian Modern Language Review / La Revue canadienne des langues vivantes, 60, 549–586. https://doi.org/10.3138/cmlr.60.5.549

Ellis, R. (2003). Task-based language learning and teaching. Oxford: Oxford University Press.

Erlam, R. (2016). ‘I’m still not sure what a task is’: Teachers designing language tasks. Language Teaching Research, 20(3), 279–299. https://doi.org/10.1177/1362168814566087

Field, A. (2009). Discovering statistics using SPSS (3rd ed.). London: Sage.

Foster, P. (2009).
Task-based language learning research: Expecting too much or too little? International Journal of Applied Linguistics, 19(3), 247–263.

González-Lloret, M., & Nielson, K. B. (2015). Evaluating TBLT: The case of a task-based Spanish program. Language Teaching Research, 19(5), 525–549. https://doi.org/10.1177/1362168814541745

Harris, M. (2014). Improving group dynamics through cooperative learning principles and task-based approaches: Action research report. Sookmyung MA TESOL.

Huang, D. (2016). A study on the application of task-based language teaching method in a comprehensive English class in China. Journal of Language Teaching and Research, 7(1), 118–127.

Larsen-Freeman, D. (2011). A complexity theory approach to second language development/acquisition. In Alternative approaches to second language acquisition (pp. 48–73). New York: Routledge.

Leaver, B. L., & Willis, J. R. (2004). Task-based instruction in foreign language education: Practices and programs. Washington, D.C.: Georgetown University Press.

Li, M., & Eckhart, B. (2018). Summer teaching positions: English communication instructors to teach at Huazhong Agricultural University in Wuhan (July 16 – August 3, 2018). Retrieved from http://nealrc.org/hzausecp/Announcement2-11-2018.pdf

Loewen, S. (2015). Focus on form. In Introduction to instructed second language acquisition (pp. 56–75). New York: Routledge.

Loewen, S. (2015). The acquisition of vocabulary. In Instructed second language acquisition (pp. 95–114). New York: Routledge.

Long, M. H. (2016). In defense of tasks and TBLT: Nonissues and real issues. Annual Review of Applied Linguistics, 36, 5–33. https://doi.org/10.1017/S0267190515000057

Long, M. H., & Crookes, G. (1992). Three approaches to task-based syllabus design. TESOL Quarterly, 26(1), 27. https://doi.org/10.2307/3587368

McDonough, K., & Chaikitmongkol, W. (2007). Teachers’ and learners’ reactions to a task-based EFL course in Thailand. TESOL Quarterly, 41(1), 107–132.

Norris, J. M. (2016). Current uses for task-based language assessment. Annual Review of Applied Linguistics, 36, 230–244. https://doi.org/10.1017/S0267190516000027

O’Connell, S. (2015). A task-based language teaching approach to the police traffic stop. TESL Canada Journal, 31(8), 116–131. https://doi.org/10.18806/tesl.v31i0.1189

Philp, J., & Duchesne, S. (2016). Exploring engagement in tasks in the language classroom. Annual Review of Applied Linguistics, 36, 50–72. https://doi.org/10.1017/S0267190515000094

Prabhu, N. S. (1987). Second language pedagogy: A perspective. Oxford: Oxford University Press.

Prabhu, N. S. (1984). Procedural syllabuses. In T. E. Read (Ed.), Trends in language syllabus design (pp. 272–280). Singapore: Singapore University Press/RELC.

Révész, A. (2014). Towards a fuller assessment of cognitive models of task-based learning: Investigating task-generated cognitive demands and processes. Applied Linguistics, 35(1), 87–92. https://doi.org/10.1093/applin/amt039

Robinson, P. (2001). Task complexity, task difficulty, and task production: Exploring interactions in a componential framework. Applied Linguistics, 22, 27–57.

Shehadeh, A. (2005). Task-based language learning and teaching: Theories and applications. In C. Edwards & J. Willis (Eds.), Teachers exploring tasks in English language teaching (pp. 13–31). New York: Palgrave Macmillan.

Van Den Branden, K. (2009).
Mediating between predetermined order and chaos: The role of the teacher in task-based language education. International Journal of Applied Linguistics, 19(3), 264–285. https://doi.org/10.1111/j.1473-4192.2009.00241.x

Van Den Branden, K. (2016). The role of teachers in task-based language education. Annual Review of Applied Linguistics, 36, 164–181. https://doi.org/10.1017/S0267190515000070

Van Gorp, K., & Bogaert, N. (2006). Developing language tasks for primary and secondary education. In M. H. Long & J. C. Richards (Eds.), Task-based language education: From theory to practice (pp. 76–105). Cambridge: Cambridge University Press.

Willis, J. (1996). The TBL framework. In A framework for task-based language teaching (pp. 52–65).

Yasuda, S. (2011). Genre-based tasks in foreign language writing: Developing writers’ genre awareness, linguistic knowledge, and writing competence. Journal of Second Language Writing, 20(2), 111–133. https://doi.org/10.1016/j.jslw.2011.03.001