When I wrote my first WOL post, I placed myself between the Early Adopter and Early Majority categories. I’ve always been curious and willing to try new tools, but I also like to understand their purpose before committing. After a semester studying emergent technologies, I realized that being an early adopter is more complex than I initially thought.
This course pushed me to see technology beyond its features. I began thinking about who it supports, who it leaves out, and how design choices affect real learners. Concepts like accessibility, ethics, and user experience became central to how I evaluated new tools. I learned that innovation isn’t just about excitement. It requires discernment and an awareness of the responsibility that comes with choosing and implementing technology.
Where I Am on the Curve Now
At the start of the term, I confidently claimed the space between Early Adopter and Early Majority. Now, as I reflect on everything we’ve analyzed, debated, and experimented with, I find myself leaning slightly more toward the Early Majority than I anticipated. I still love exploring new tools such as AI assistants, creative platforms, formative analytics dashboards, and microlearning apps, but my approach is more intentional now. The simple truth is that the more I’ve learned, the more I’ve realized how much I don’t want to adopt technologies blindly.
Instead, I want to adopt them ethically, equitably, and in alignment with learner well-being.
This shift doesn’t feel like a regression; it feels like maturing into a learning designer who understands the weight of these tools. Innovation isn’t just shiny; it has social, emotional, and ethical consequences for real people. And that awareness shapes how I see my role going forward.
How This Shapes My Approach to Future Technology
Understanding where I sit on the innovation curve now directly influences how I plan to analyze and implement technologies in my career. Instead of asking, “Is this new?” or “Is this exciting?” my future questions will sound more like:
Does this tool enhance accessibility and equity?
Does it respect learner autonomy, data, and dignity?
How does this technology change the role of the instructor or learner?
Is it actually solving a problem or simply adding complexity?
Who benefits, and who might be excluded?
In my experience, exploring technologies like AI has shown me how quickly systems can shift from helpful to problematic depending on how they’re deployed. Learning analytics demonstrated the potential for early intervention and personalized support, but also raised questions about privacy and surveillance. Microlearning showed how seamlessly learning can fit into everyday life, but also revealed how easily content can become fragmented or oversimplified if not handled well.
Moving forward, I plan to approach innovation as both a designer and an advocate. I want to embrace tools that genuinely improve learning outcomes, reduce barriers, personalize instruction, or create new pathways into knowledge, but I also want to challenge technologies that perpetuate inequity or prioritize efficiency over humanity.
Rather than rushing to adopt, I want to understand, evaluate, and then integrate with purpose.
Ethical and Social Considerations
If one theme shaped my movement on the curve, it was ethics. The speed of AI development is unlike anything I’ve ever witnessed, and while it opens extraordinary opportunities, it also raises profound questions about bias, privacy, labor, authorship, and the future of human creativity.
AI systems can personalize learning and offer immediate support, but they can also flatten nuance, reinforce stereotypes, and create dependencies if used carelessly. Learning analytics can help identify learners who need help, but they can also become tools of surveillance if institutions prioritize monitoring over empowerment. Microlearning can make content beautifully accessible, but it can also oversimplify complex ideas if used without intention.
One ethical dilemma that deeply resonated with me this semester was the role of AI in learner data collection. The idea that systems can track behavior, predict performance, or flag “at-risk” students before they even realize they’re slipping is powerful, but it also invites questions:
Who has access to this data?
How is it interpreted?
Can these predictions become self-fulfilling?
What biases exist in the model?
How do we ensure transparency and consent?
These questions pushed me slightly away from my earlier early-adopter stance. Not because I distrust technology; if anything, I respect it more now. I simply take its implications more seriously, knowing how powerful these tools can be. I understand now that emerging technologies don’t simply require skill; they require judgment. In this sense, ethics becomes not an obstacle to innovation but a compass for it.
What This Means for My Future as a Learning Designer
Reevaluating the curve helped me better understand the kind of learning designer I want to become. I want to use emerging technologies in ways that genuinely support learners rather than overwhelm them. Some people see new tools as distractions or barriers, but I see them as opportunities to expand human capability when used intentionally. My goal is to help create learning environments where technology enhances accessibility, creativity, and understanding, not just efficiency. I want to use AI to support meaningful learning, analytics to guide rather than monitor, and microlearning to offer flexibility without losing depth.
This shift toward the Early Majority does not mean I am hesitant about innovation. It means I am taking it seriously. I want to approach new tools with a focus on problem solving and real human needs. In my future work, I hope to translate emerging technologies into clear, practical solutions that improve people’s experiences and help them succeed. Technology has incredible potential to make learning more equitable and empowering. I want to be part of designing it in a way that truly improves lives.
Conclusion
Looking back at the beginning of the semester, I am still optimistic about what innovation and emerging technologies can do. But now I am also becoming the kind of designer who understands that innovation must be paired with wisdom. In the end, the curve isn’t just about how early we adopt; it’s about how responsibly we do it.
Thank you for reading,
Jeanie :)