The iNACOL Standards: My Self-Assessment

Prompt: 

Use the following Likert Scale to assess your own feelings regarding attainment of Mastery of each individual iNACOL Standard:

0 = Not at All

1 = A Little

2 = An Average Amount

3 = More than Average

4 = Masterful

Look over your list and blog about your highest scoring standard and your lowest (if you have a tie with another standard, select only 1 to write about)

Post your thoughts on why you scored as high/low on these standards as you did. In particular, what is it about your lowest scoring standard that needs more work/focus to attain mastery? What might you do moving forward to become more expertly proficient with that particular standard (the skills and understanding it represents)?

My self-rating for each iNACOL Standard (0-4 scale):

Standard A: 4
Standard B: 4
Standard C: 3
Standard D: 4
Standard E: 4
Standard F: 2
Standard G: 4
Standard H: 4
Standard I: 2
Standard J: 3
Standard K: 4

I gave my highest rating to the very last iNACOL Standard K, because I feel that I am an expert in arranging electronic media resources to support student learning (International Association for K-12 Online Learning, 2011). I conduct a lot of technology trainings for teachers in my district each week, and I have consistently received my most positive feedback for my training sessions that relate to media tools like YouTube, EDpuzzle, and online multimedia game apps like Kahoot and Quizizz. Even the other Education Technology Specialists with whom I work frequently send media-related questions to me.

On the other hand, the topic that I feel I have the most to learn about is how to make online learning more accessible to all learners, including those with special needs, which is represented by iNACOL Standard I (International Association for K-12 Online Learning, 2011). Even after several Brandman courses and many meetings and trainings with my district’s special education staff, I feel there is still much for me to learn about accessibility apps, devices, and teaching methods that help ensure equitable access for students with special needs. I am not alone in this deficit. I recently spent three full days at an intensive training on my school district’s newly adopted online English Language Arts curriculum. The training session was designed and executed by a visiting representative from the textbook publisher, and most of the agenda focused on teaching methods and how to use both the electronic and print resources that the teachers will begin using next year. Almost no time was spent discussing special-education students, until I interrupted the presentation a few times to point out ways in which teachers could cut and paste the resources into text-to-speech apps.
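As a rough illustration of that kind of cut-and-paste workaround, here is a minimal Python sketch that reads a pasted passage aloud. The third-party pyttsx3 package is my own choice for the example, not a tool that was named in the training, and the sample passage is purely illustrative.

```python
# Minimal sketch of a cut-and-paste text-to-speech workaround.
# Assumes the third-party pyttsx3 package is installed (pip install pyttsx3);
# the sample passage below is purely illustrative.
import pyttsx3

def read_aloud(text, rate=150):
    """Speak the given text using the system's default voice."""
    engine = pyttsx3.init()
    engine.setProperty("rate", rate)  # slow the speech slightly for listeners
    engine.say(text)
    engine.runAndWait()

if __name__ == "__main__":
    passage = "Paste a section of the curriculum text here for students to hear."
    read_aloud(passage)
```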

A major textbook company developing online versions of its big-ticket programs should not treat accessibility as an afterthought, and teachers should not have to jury-rig their own accessibility solutions for their students. Little things like subtitles on videos and push-button access to screen readers are not actually little things. For some of our learners, these options can make the difference between success and failure to learn. I’m sure I can do more to support teachers with this part of their job, and I look forward to attending more conference sessions and doing more research over the next several months to further develop my expertise in working with students with special needs.

References

International Association for K-12 Online Learning. (2011). National standards for quality online teaching. Retrieved from http://www.inacol.org/resource/inacol-national-standards-for-quality-online-teaching-v2/

Looking Back at Backwards Design

Screenshot of my completed Demo Unit.

Backwards Design was a very complicated process for me. For the past six years, I have worked in a school district that has very strict rules about lesson design, including a list of instructional norms and a very intricate common lesson design template. However, my district has not devoted much time to designing entire units.

Now that we have provided each student in our district with a Chromebook, and teachers are busy adapting their course curricula to take full advantage of technology, we have noticed that unit design will have to be a renewed area of emphasis for our district. My new boss, the incoming Assistant Superintendent of Educational Services, has made unit design a top priority for the coming school year. She has even mentioned the UBD framework as a possible focus of professional development for teachers next year. So my in-depth experience with UBD lesson design over the past several courses here at Brandman has not only helped me design better units, it has also given me a leg up on my District’s near-term instructional goals.

My own learning journey with UBD has been very refreshing. I very much liked the emphasis on developing key learning objectives, including deciding how those objectives will be measured, prior to designing any lessons or learning activities. It is too easy for teachers to fall into the trap of designing a lesson first, particularly when a teacher simply wants to try out a new technology tool. Teachers often ask me, for example, to help them create a lesson in Pear Deck or some other tool. I always answer this sort of question with my own question: What is it you wish your students to learn? Teachers frequently need this prompt to back up their thinking and make sure that the technology tool really is the best choice for the desired learning outcome.

I am confident that I will use my Demo Unit in the future. I’m not sure it will be used in its complete form, but bits and pieces of it can certainly be used for various professional development experiences that my partners and I will be designing in the coming months. In fact, I have already used one of the resources I created for the Demo Unit, the YouTube Diner, as part of several recent presentations I have given at local technology conferences. This resource is a particularly good example of how my Demo Unit helps learners engage with content, because they must learn several specific concepts and skills related to embedding a YouTube video into a lesson for their own students.

My Demo Unit also has several examples of learners constructively engaging with peers, including the regular Threaded Discussion and Wiki posts. This is a skill that my district has not developed much until now, because our professional development has traditionally been done in person at faculty meetings, small-group work sessions, summer academies, etc. One of my bosses recently challenged us to start offering some of our training and professional development resources in an online format. My experience with this Demo Unit’s online discussions and virtual online meetings has helped me design a prototype professional training portal. A few weeks ago, my partners and I created our first training via this portal, a web-based conference designed to help teachers design Digital Citizenship lessons via Pear Deck. My partners and boss were very impressed that I was able to effectively lead a one-hour virtual online meeting with teachers. I never could have done this without my recent experiences in developing my Demo Unit.

Finally, my Demo Unit has several examples of engaging learners in the electronic environment, including the Key Summative Assessment New Google Sites Portfolio. This portion of the Demo Unit is, to be honest, one that I am not completely satisfied with. Whenever I expect adults to design an authentic product with a new technology tool, I like to provide a template for them to modify, so they don’t have to start a new product completely from scratch. However, there is, as yet, no way to share a template web site with others on New Google Sites. This functionality is available in our District’s G Suite domain with Classic Google Sites, but unfortunately Classic Google Sites is not nearly so user-friendly. Hopefully New Google Sites will add this capability in the future. In the meantime, I am still looking for a way to more effectively scaffold the creation of this portfolio.

Coming Soon: A Field Guide to Educational Apps

I have been creating classroom websites for my students since the old GeoCities days in the late 1990s. Since then, I’ve moved across several different platforms as both the available web-page tools and my job placements have changed. Most recently, I created a classroom web page using WordPress.org, although since I have left the high school classroom to pursue a full-time career as an education technology specialist, my website is now designed for adult learners (Wise, 2017b). For my current personal web page, I also purchased a domain name, which became very convenient when I became a Google for Education Certified Trainer last year. As a perk of being a Certified Trainer, I have access to my own private G Suite for Education domain, which is housed through my personal domain. This is one of the hidden treasures of being a Certified Trainer; whenever I want to learn how Google for Education tools really work from the inside, I don’t need to ask the technicians at my district office; rather, I can simply poke around the Admin panel of my own Google domain and find out for myself.

For this week’s Blog post, however, I decided to give Weebly a try. I was very happy to find that Weebly, like WordPress.org, supports building a web site on one’s own private domain. Because of my Google work, this is an essential feature for any web page I would ever want to build in the future. Byrne (2011) recommended the Weebly platform because of its intuitive, yet powerful, interface, and I must agree that, six years later, Weebly is still very simple to use. The drag-and-drop functionality and simple page formatting are impressive, especially to a seasoned internet veteran like me who remembers the tedious process of uploading images via FTP and hand-coding web pages using raw HTML.

Image Source: Mike Licht, https://www.flickr.com/photos/notionscapital/23038465070

I would like to use this Weebly site to build a quick-reference web site that curates the most common web-based educational tools, apps, and extensions that teachers in my school district are currently using (Wise, 2017a). In my job, teachers frequently tell me about an idea they have for a lesson or learning activity, and they finish with the oft-repeated question, "Is there an app for that?" Administrators, on the other hand, often tell me about an app or extension they saw a teacher using during a lesson and want to know more about what it can do and whether it was the best tool for whatever the students were learning. I believe both questions can be answered by creating a web page with quick-reference information for the apps and extensions that are most in use by our teachers. I think of it as a Field Guide to Educational Apps.

Ideally, I would like to make this resource available to both teachers and administrators, so that people from either perspective will find useful information that encourages them to think about how technology tools may be used best to support student learning. In addition to short reviews for each app, I would like to embed reflective questions that encourage educators to think critically about how to choose the best app with regard to safety, security, functionality, cost, ease of use, and ability to foster deeper levels of knowledge. One thing this web site must have is an interface that renders well on all devices and screen sizes, from teachers working on a classroom desktop computer, all the way to administrators accessing the information on their smartphones.
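To make the curation idea a little more concrete, here is a purely hypothetical Python sketch of how one entry in the field guide might be structured as data, using the criteria named above. The field names and the sample entry are my own illustrative assumptions, not a finished design.

```python
# Purely hypothetical sketch of a single field-guide entry; the field names
# and the sample app data are illustrative assumptions, not a finished design.
from dataclasses import dataclass, field

@dataclass
class AppEntry:
    name: str
    short_review: str
    cost: str                  # e.g., "free", "freemium", "site license"
    safety_notes: str          # data privacy / student safety considerations
    ease_of_use: int           # 1 (steep learning curve) to 5 (intuitive)
    reflective_questions: list = field(default_factory=list)

sample = AppEntry(
    name="Quizizz",
    short_review="Game-style formative assessment with student-paced questions.",
    cost="free",
    safety_notes="Check district policy before collecting student names.",
    ease_of_use=5,
    reflective_questions=[
        "Is this the best tool for the desired learning outcome?",
        "Does it foster deeper levels of knowledge, or only recall?",
    ],
)

print(sample.name, "-", sample.short_review)
```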

I will be designing this web site over the summer vacation months, with the hope that the site will be completed in time for the beginning of the next school year in August–assuming, that is, I can find the time in my already jam-packed schedule of kids’ soccer games, hiking, and relaxing. This is summer, after all.

References

Byrne, R. (2011, February 7). 10 ways for teachers & students to build websites [Web log post]. Retrieved from Free Technology for Teachers web site: http://www.freetech4teachers.com/2011/02/10-ways-for-teachers-and-students-to.html#.WT2szWjyvD6

Wise, B. (2017a). Field guide to educational apps. Retrieved from http://mrwisetech.weebly.com/

Wise, B. (2017b). Mr. Wise tech – a place for teachers using technology in the classroom. Retrieved from http://mrwisetech.com/

Give Your Web Site a Quick Accessibility Checkup

http://ncam.wgbh.org/invent_build/web_multimedia/tools-guidelines/favelet 

The NCAM Accessibility QA Favelet is an elegant, yet powerful, way to check a website for accessibility (National Center for Accessible Media [NCAM], 2009). The Favelet works with a variety of browsers; even though it was first created in 2009, I found it worked quite well in Google Chrome on Windows 10, as long as I took a moment to grant permission to run the JavaScript code. The Favelet checks for many common accessibility issues, including Alt tags on images and heading structures. The Favelet also shows you what your website looks like as unformatted text (see screenshots below).
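For anyone curious what this sort of check looks like under the hood, here is a minimal Python sketch of the same kind of alt-text check the Favelet performs. It is not the Favelet itself (which runs as JavaScript in the browser); it assumes the third-party requests and beautifulsoup4 packages, and the example URL is a placeholder.

```python
# A minimal sketch (not the Favelet itself) of one check it performs:
# flagging images that lack descriptive alt text. Assumes the third-party
# packages requests and beautifulsoup4 are installed.
import requests
from bs4 import BeautifulSoup

def find_images_missing_alt(url):
    """Return the src of every <img> on the page that has no alt text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    missing = []
    for img in soup.find_all("img"):
        alt = (img.get("alt") or "").strip()
        if not alt:
            missing.append(img.get("src", "(no src)"))
    return missing

if __name__ == "__main__":
    # Hypothetical example URL; substitute your own site.
    for src in find_images_missing_alt("https://example.com"):
        print("Missing alt text:", src)
```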

Of course, when it comes to public school web pages, making content accessible to people with disabilities isn’t just a good idea, it’s the law. Section 508 requires public agencies to take steps to ensure that their online content is accessible to everyone (General Services Administration, n.d.). But accessibility should be more than just a compliance issue; educators should always look for opportunities to make their web-based content easy for everyone to understand.

Screenshot of the NCAM Accessibility QA Favelet’s image information analysis tool. Apparently I have a little work to do, because several images on my blog site don’t have descriptive Alt tags.

Screenshot of my blog site as rendered by the NCAM Accessibility QA Favelet’s style removal analysis tool.

References

General Services Administration. (n.d.). GSA government-wide Section 508 accessibility program. Retrieved from https://www.section508.gov/

National Center for Accessible Media. (2009). NCAM Accessibility QA Favelet. Retrieved from http://ncam.wgbh.org/invent_build/web_multimedia/tools-guidelines/favelet

My Live Online Lesson: Lessons Learned

In my position as a technology specialist, I often participate in online meetings and webinars. However, I do not often lead online meetings, so this week’s live online lesson was still a bit of a new experience for me. I was very grateful for the opportunity to present to two trusted colleagues, rather than a live audience of strangers, because I still have much to learn before I consider myself an online teaching expert.

First, I found that my attention was often distracted by the unfamiliar controls of the online meeting app that I was using. I use a different online meeting app every week; in just the past month, for example, I have participated in a Google Hangout, an Adobe Connect meeting, a GoToMeeting, and a YouTube Live discussion. Each of these platforms has similar functionality, but the buttons and controls are all slightly different and located in different places. The important lesson here is that an effective online teacher should select one platform for online trainings, stick with it, and use it often enough that students also become comfortable using it. In my district, we are currently shopping for a paid service to help us manage a series of webinar training programs we will be creating next year. Everyone recognizes the importance of selecting one common training platform; people are busy and do not want to spend a half hour learning a new platform every time they have to attend an online meeting.

Second, I should have taken the advice of Greene (2014), who admonished online instructors to speak more slowly than in everyday conversation. I tend to talk quickly, which is quite normal in the fast-paced administrative world, but when teaching, I need to give students time to internalize what they are hearing and connect it to the concepts in their working memory. My colleagues were very polite, but I think I may have spoken a little too quickly and tried to squeeze too many concepts into a 10-minute presentation. If I were to ramble on like this for a full hour, many of my students might lose interest and start browsing other web sites on hidden tabs.

Finally, I noticed that my two students responded very positively to the live online quiz game at the end of my lesson. I selected Quizizz as the platform for this assessment because it is commonly used by K-12 teachers in my District, but that doesn’t mean that it can’t be used effectively for adult learners. Everyone likes to have a little entertainment with their assessment, after all; the soft music, colorful interface, and humorous feedback memes helped break the ice. Learning is difficult work, and teachers shouldn’t be afraid to take the occasional opportunity to have a little fun, as long as the focus on learning is not lost. In the case of my Quizizz game, my students completed a four-question quick check in about two minutes, which gave me a rapid insight into which concepts were well understood, and which one was still confusing to my class. This type of formative assessment is crucial in the online learning environment, of course, because students are not physically present, so it can be difficult to read their facial expressions and other nonverbal cues.

Reference

Greene, K. (2014, March 19). Wk4_BigMarker_Online_Lesson [Video file]. Retrieved from https://youtu.be/_lF3-ox8AhA

The Importance of Feedback: Both Fast and Slow

Feedback is one of the most important parts of a teacher’s job. Even if they don’t always act like it, students tend to be very interested in teachers’ reactions to their work. The emotional course of a young person’s entire day can be strongly affected by the sort of feedback–both positive and negative–that a teacher may provide. The power of effective feedback, of course, doesn’t only impact a student’s affective domain; cognitive development also depends on frequent and specific feedback from teachers. As Wiggins (2012) pointed out, feedback based on formative assessment is one of the most powerful factors affecting student learning.

For adult learners, of course, feedback is no less important, but it needs to be structured a little differently. Adults, especially professional educators, often have a well-earned sense of their own expertise, which they may have developed over the course of many years or even decades of classroom experience with young people. Even the most personable and trustworthy administrator, instructional coach, or trainer might offer a well-stated and specific suggestion to a teacher, only to be rebuffed as a non-expert who doesn’t know what he or she is talking about. In short, providing critical feedback to adults is complicated.

Mochari (2014) described how Abraham Lincoln was masterful at providing feedback to his generals during the Civil War. After Lincoln’s death, letters were discovered in his desk that he had written but wisely decided never to send. Writing these letters must have helped Lincoln clarify his thinking, and perhaps even vent some of his frustrations and anxieties about trying to keep the young nation together in spite of a brutal war. Among Mochari’s takeaways was the importance of putting yourself into someone else’s shoes before criticizing him or her (Mochari, 2014, para. 11). I once worked for a superintendent who developed a well-deserved reputation for fits of rage, during which he would yell at employees so loudly that others could hear his every word through the wall. Hearing my colleagues, some of whom had been educators for decades, being cursed at and belittled didn’t just affect their morale and self-esteem; it had negative effects on everyone within earshot. While this is a rather extreme example, the fact remains that supervisors must tread lightly when providing feedback.

Of course, treading lightly isn’t always possible. Swartz (n.d.) outlined a system of writing feedback she used with her online language arts students. Among her insights was a commitment to provide feedback via electronic comments within 24 hours of a student writing submission. Prompt feedback is key to learning because learners need to hear both positive reassurances and suggestions for improvement while the work is still fresh in their minds. Most people have had the dubious experience of receiving feedback so long after finishing a job that many of the decisions and actions associated with the work have been forgotten.

As I consider the feedback mechanisms in my Demo Unit for this course, I believe there are several good ways that the instructor can provide prompt and specific feedback to learners. Wiki posts and threaded online discussions, for example, give instructors an opportunity to participate in a discussion in real-time, certainly within the 24-hour constraint that Swartz developed. Virtual online meetings and video conferences allow for feedback that is even quicker, in that they approximate the face-to-face conversations that occur constantly in a classroom environment.

Although feedback is most effective when it is prompt, it is sometimes important for teachers to take at least a little more time to carefully reflect on which suggestion(s) will have the most impact on learning. Several of my Demo Unit assignments have rubric-based feedback, associated with formal grading tasks, which are somewhat slower than online meetings and simultaneous discussions. The advantage to slower feedback is that great, breakthrough ideas often require more time and reflection before they can be formulated. If there is one lesson that we can learn from Abraham Lincoln, it is that the slower, more deliberate approach may not always be the most popular choice, but it is sometimes the most effective.

References

Mochari, I. (2014, February 11). Abraham Lincoln’s brilliant method for handling setbacks. Retrieved from http://www.inc.com/ilan-mochari/lincoln-lesson-setbacks.html

Swartz, J. (n.d.). Strategies for providing substantive feedback in language arts in the online environment. Retrieved from http://itlab2.coe.wayne.edu/it6230/casestudies/english/english.html

Wiggins, G. (2012). Seven keys to effective feedback. Educational Leadership, 70(1), 10-16. Retrieved from http://www.ascd.org/publications/educational-leadership/sept12/vol70/num01/Seven-Keys-to-Effective-Feedback.aspx

The Nature of Our Learners: Prezi Reflection

For this week’s Blog post, I will be reviewing Elizabeth Neal’s (2017) presentation about the nature of our learners, and comparing and contrasting her thoughts and beliefs with mine. Neal’s presentation describes three specific beliefs about 21st Century learners, and makes connections to relevant iNACOL standards throughout (International Association for K-12 Online Learning, 2011). My presentation similarly concerns three of my core beliefs and makes reference to the same standards.

Both Neal and I listed a strong system of feedback as essential to understanding the nature of our learners, which both of us related to iNACOL Standard D. Her associated artifact, an Edutopia article about how to provide effective feedback to students, strongly supports and expands upon this belief. Although we both noted that effective feedback can come from both students’ peers and teachers, she went a step further by mentioning the possibility of eliciting feedback from experts. When students are using technology to produce authentic products, the value of feedback from experts in the field is especially important to keep in mind. Teachers should endeavor to establish partnerships with community experts, who can provide valuable feedback by sitting in a presentation audience and/or providing written feedback on student work. Neal also overtly connected this belief in the importance of feedback to students’ cognitive domains. Rather than making such connections to the affective, behavioral, and cognitive domains, I instead made specific mention of components of my Demo Unit that support each of my three beliefs. Although we approached this assignment from slightly different perspectives, I think both approaches resulted in a good analysis.

For Neal’s second belief, she outlined the importance of student motivation, which she related to iNACOL Standard A. I also mentioned student engagement in my presentation, although I linked it instead to iNACOL Standard B. Our beliefs about these two standards are somewhat similar; Neal focused on students being able to make choices in their learning, and provided a link to a journal article about fostering students’ skills in working independently. I chose instead to cite a resource about how technology tools can be used to promote student interest. Both of these connections are, I think, valid. Both are clearly tied to the affective domain of learning, although Neal made this connection explicit, while I did not.

Finally, Neal described the importance of students being actively engaged in their learning, which she related to iNACOL Standard C. She connected engagement strategies to the behavioral domain, which is a connection I would not have thought of. I tend to connect student engagement primarily to motivation and student affect, as I explained above. Her artifact is a very interesting video that illustrates the concept that 21st Century students make a very real contribution to the learning relationship because they have technological expertise that their teachers may lack (MacPherson Institute, 2015). Teachers are still, of course, content experts, but students are adept at using multiple technological tools to efficiently find and vet information.

My third belief, by contrast, concerned a completely different issue, that of using formative assessment to inform, modify, and improve instructional design. I related formative assessment to iNACOL Standard I. I think this is an especially important concept in blended and online instruction, because teachers cannot always rely on nonverbal cues and face-to-face conversation with students in order to assess whether or not they are understanding the desired learning objectives. In fact, it is sometimes challenging for the online teacher to even know whether or not the students are paying attention. Thus, the importance of using frequent, multilayered formative assessment looms large in 21st Century learning. For this final argument, I selected a very practical artifact in the form of a blog post that outlines several dozen tools and apps that teachers can use to make different types of formative assessments.

References

International Association for K-12 Online Learning. (2011). National standards for quality online teaching. Retrieved from http://www.inacol.org/resource/inacol-national-standards-for-quality-online-teaching-v2/

MacPherson Institute. (2015, October 26). Peter Felten on engaging students as partners in learning and teaching [Video file]. Retrieved from https://youtu.be/pPU4ckBBeEU

Neal, E. (2017, May 11). The nature of our learners [Prezi slides]. Retrieved from http://prezi.com/sdvtycdt8n9g/?utm_campaign=share&utm_medium=copy&rc=ex0share

Refining the Demo Unit: Strengths and Areas for Growth So Far

My Demo Unit contains many elements that strongly address some of the iNACOL Standards (International Association for K-12 Online Learning, 2011). Three standards in particular stand out as areas of strength in my Demo Unit:

iNACOL Standard D

The threaded discussion and summative assessment rubrics provide clear expectations for how my learners need to demonstrate their learning. At the same time, the rubrics were carefully designed to accommodate a wide variety of project formats and technology tools, so learners still have plenty of freedom in how they design their learning. For example, the summative portfolio must include a minimum of three artifacts and a reflective essay, but each artifact can take almost any format, including a text document, slideshow, image, video, or URL link to some other online resource the learner has designed.

iNACOL Standard J

Communication with colleagues and stakeholders is an important part of my job, so I feel this is a real strength in my Demo Unit. The sample letters I included in my Hidden Instructor Resources folder are just small samples of my everyday interactions with technology tool vendors, colleagues in other districts, parents, students, etc.

iNACOL Standard F

Many of my weekly lesson design tasks are very flexible. These open-ended assignments create rich opportunities for accommodation because learners have a lot of choices to make in how to represent their lesson design tasks. For example, the weekly assignments may be submitted as either text or slideshow documents. My Demo Course also includes several screencast video explanations with embedded subtitles, which provide some support for learners with disabilities and/or different learning styles.


Of course, the work of a thoughtful designer is never really complete, and I see several areas in which my Demo Unit still needs revision in order to meet some iNACOL Standards:

iNACOL Standard B

Since the whole point of my Demo Unit is to teach a variety of media tools, I’m concerned that the curriculum may have too many new technology tools squeezed into the original seven-week time frame. Perhaps it would be better to spread this number of technology tools out over an entire school year of professional development meetings. Maybe each lesson should really happen over the course of a month, rather than a week, since many of my learners are employed as full-time classroom teachers. Also, the Demo Unit does not include any lesson that encourages learners to connect with outside colleagues via Edmodo, Twitter Chat, or similar networking activity designed to support a community of practice. In order to address this shortcoming, I will be replacing one of the redundant assignments in Week 6 with a social media assignment.

iNACOL Standard G

I feel that my assessments are very strong overall, especially the summative ones. However, there are not sufficient formative assessments upon which to make adjustments in the course, especially in the early lessons. I will be adding a formative assessment component that checks learners for basic conceptual understanding after the second or third lesson of the Demo Unit. I will probably design an easy- to medium-difficulty test of basic knowledge of SAMR, HyperDocs, and using YouTube for education. In keeping with the Demo Unit’s main theme of media tools, I will probably structure the quiz in a way that incorporates embedded graphics and video clips via Socrative or a similar online assessment platform.

iNACOL Standard I

While I am designing the quiz mentioned above, I will also include one or more questions that solicit feedback on how the course is going, including suggestions for how the remaining lessons might be better explained or modified to meet learners’ needs, so that the early formative assessment may be used to make adjustments to the course in real time.

 

Reference

International Association for K-12 Online Learning. (2011). National standards for quality online teaching. Retrieved from http://www.inacol.org/resource/inacol-national-standards-for-quality-online-teaching-v2/

Authenticity In the Exciting World of Education Technology

The design of my Demo Unit was an authentic experience because I decided to make a course for the adult professional educators whom I work with every day as part of my job. It would have been much easier for me to build a Demo Unit full of lesson resources that I have used in the past with my high school science students, but that would not have been an authentic learning experience for me, because I no longer teach high school students. In short, I decided to look forward to the future, building learning activities that I might use next year, rather than looking backward.

I think it’s important to look ahead when designing curriculum. Too often, I see teachers create activities whose primary purpose is to review prior knowledge. I’m not sure this is a great way to build student motivation for learning. Especially at the secondary level, teachers often precede a key exam or quiz with one or more review days. These lessons typically include vocabulary games, Jeopardy!-style competitions, and teacher-led rehashing of previously learned (or, perhaps, previously not-learned) concepts. While many of these activities are designed with good intentions, and may help remind students what they should remember for their upcoming test, I wonder if they are very effective. When students hear that today is a review day, they may feel entitled to turn off their brains to some extent, because their teacher is basically telegraphing that nothing new will be learned today. Ideally, however, something new should be learned every day!

So I am tempted to blow up the idea of the review day. In fact, maybe we should even blow up the idea of a test or quiz as a primary summative assessment altogether. Imagine how much more engaged a student might be in a one- or two-day summative project that would require some independent research and application of key concepts to practical problem-solving. Rather than give my students a multiple-choice quiz about atomic structure, for example, why not assign them to prepare multimedia presentations about how atomic structure relates to everyday teenage problems, like crack-resistant cell phone screens or artificial sweeteners? After all, I typically put my chemistry students to sleep with even the best-designed lessons about electron orbitals, but if I taught them how those orbitals related to, say, the interaction of a drug molecule with the human brain, suddenly their interest level would increase dramatically.

Upon final reflection about our Demo Unit assignment at Brandman, one modification that could have made the learning much more authentic would have been allowing some flexibility about the platform used. Although Blackboard and CourseSites are still very good ways to structure online learning units, there are newer platforms with more reliable functionality, like Canvas. In my specific case, because I work in a district that has a 1:1 Chromebook implementation, it would have made more sense for me to structure my Demo Unit using Google Sites, Classroom, Hangouts, and other Google Apps. Of course, I can appreciate the difficulty in modifying course content to stay up to date with the latest developments in education technology.

In fact, I can foresee a time when something else may replace even Google as the predominant technology platform in my district. Both Apple and Microsoft, for example, are developing their own classroom technology environments. I wouldn’t be surprised at all to see my colleagues and their students using OneDrive and/or Microsoft Classroom just a few years from now. Just don’t tell them I said that! It can be daunting enough for educators just to keep up with the amazing variety of apps and web tools that are available right now. Considering how rapidly education technology may evolve over the next five years, I think we have much to be excited–and frightened–about.

Navigating UDL and Technology In the 21st Century Classroom

In a world where Universal Design for Learning (UDL) is the standard for how learning is structured, there should be fewer missed opportunities to engage students. This matters because modern technology, used to its full advantage, can level the playing field for students with disabilities and special learning needs. If I were to encounter an administrator who was hesitant to invest funds in staff development to support UDL, I’d point out that universal design also supports students who, for a variety of reasons, don’t tend to engage well in traditional teacher-led, textbook-based learning. These reasons can include preferred learning styles, varying personalities, and barriers presented by differences in culture, home language, and/or socioeconomic background. Even simple technology features, like the ability to pause and replay a YouTube video or to enable subtitles with a simple mouse click, can make a tremendous difference for a student facing any of the challenges listed above.

Of course, faithfully implementing UDL in all classrooms is a tall order. In addition to the obvious challenges of limited time and funding, there is also an attitude among many educators that technology is contributing to an increase in cheating, including plagiarism, students sharing answers, and erosion of the security of high-stakes tests. To teachers who are nervous about such threats to the validity of assessment, I would offer the glimmer of hope that technology may help us detect plagiarism much more quickly and objectively via checking apps such as LiveText and Turnitin.

There are even more powerful tools for detecting dishonesty coming very soon. My school district, for example, has been evaluating a new form of software that allows teachers to access their students’ active browser tabs and URL histories, and even to partially control the active windows of their devices. While these functionalities give teachers a lot of exciting new ways to streamline classroom instruction, they also open the door to an unprecedented level of access to what students choose to read and watch, including what they do with their devices during non-school hours. I am concerned about the potential overreach that is possible with these technologies. I recently participated in a webinar in which privacy expert Amelia Vance pointed out that all of our new and powerful forms of electronic tracking of students may contribute to a so-called surveillance effect, which may inhibit the formation of trust between students and teachers (A. Vance, personal communication, April 5, 2017). I would hope that teachers, administrators, and parents can work together to create a new set of norms that will help us optimize the environment for student learning in ways that are equitable and motivating, while at the same time monitoring our students well enough to ensure their safety and honesty.

Perhaps the most important thing teachers can do in order to support these efforts is to design learning activities that take full advantage of the technologies currently available to their students. The SAMR conceptual framework, although far from perfect, gives teachers a useful tool they may use to evaluate the quality of technology use in their lesson designs (Dunn, 2013). My Demo Course incorporates a brief SAMR reflection that asks teachers to identify which level a lesson utilizes: substitution, augmentation, modification, or redefinition (Dunn, 2013). This tool reminds me of Bloom’s taxonomy, which I learned two decades ago when I was starting my career as a classroom teacher. In fact, Bloom’s taxonomy itself has been updated to make more explicit connections to specific measurable learning outcomes (Iowa State University of Science and Technology, 2017). I think it’s useful for teachers to consider, as they write the learning objective of a lesson, which Bloom’s level and/or SAMR category the lesson fits into. The bottom line is that it’s more difficult for students to cheat when we ask them to answer questions and solve problems that don’t have a simple, single correct answer.

References

Dunn, J. (2013). New padagogy wheel helps you integrate technology using SAMR model. Retrieved from Edudemic web site: http://www.edudemic.com/new-padagogy-wheel-helps-you-integrate-technology-using-samr-model/

Iowa State University of Science and Technology. (2017). Revised Bloom’s taxonomy. Retrieved from http://www.celt.iastate.edu/teaching/effective-teaching-practices/revised-blooms-taxonomy