Week 15: AJET Special Issue: Emerging Tech in Ed for Innovative Pedagogy and Competency Development (2021), 37(5)
Article 1: Students’ Competencies Discovery and Assessment Using Learning Analytics and Semantic Web
Halimi and Seridi-Bouchelaghem (2021) explore how learning analytics and semantic web technologies can be used to discover and assess students' competencies. This study particularly resonates with me because it challenges traditional assessment approaches, advocating for more dynamic, data-driven methods. The integration of semantic web technologies to map and assess competencies is innovative and forward-thinking, as it allows for real-time analysis of student progress and skill acquisition.
One intriguing concept is the use of ontology-based competency models. This approach facilitates the creation of detailed profiles that can evolve as students engage with different learning materials. I find this particularly fascinating because it shifts assessment from static testing to continuous monitoring, which aligns well with formative assessment practices.
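To make the idea of an evolving profile concrete, here is a minimal sketch of my own in Python. It is not the authors' ontology-based implementation; the class and method names (`CompetencyProfile`, `record_evidence`) are invented for illustration, and the update rule (an exponential moving average) is simply one plausible way a mastery estimate could evolve as new evidence arrives.

```python
from dataclasses import dataclass, field

@dataclass
class CompetencyProfile:
    # competency name -> running mastery estimate in [0, 1]
    mastery: dict = field(default_factory=dict)

    def record_evidence(self, competency: str, score: float, weight: float = 0.3) -> None:
        """Blend a newly observed score into the running estimate
        (exponential moving average), so the profile changes each time
        the student engages with new learning material."""
        prev = self.mastery.get(competency, 0.0)
        self.mastery[competency] = (1 - weight) * prev + weight * score

profile = CompetencyProfile()
profile.record_evidence("data_literacy", 0.8)
profile.record_evidence("data_literacy", 1.0)
print(round(profile.mastery["data_literacy"], 3))  # → 0.468
```

The point of the sketch is the shift it embodies: the profile is never "final," which is exactly what makes this style of assessment formative rather than summative.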
However, I do have reservations about the complexity of implementing such a system. While the model is theoretically robust, practical challenges such as data privacy, system integration, and faculty training are not fully addressed. Additionally, I wonder how educators can balance the technical demands of implementing semantic web technologies with their primary teaching responsibilities.
What I find most inspiring is the potential for personalized learning paths based on real-time data analysis. This could significantly enhance student engagement by offering tailored feedback and guidance. I believe that further research should explore how to simplify the implementation process to make it more accessible to educators with limited technical expertise.
Article 2: Applying Natural Language Processing to Automatically Assess Student Conceptual Understanding from Textual Responses
Somers et al. (2021) delve into the application of natural language processing (NLP) to evaluate students' conceptual understanding from their written responses. This approach resonates strongly with my interest in automating formative assessment, as it demonstrates how technology can reduce the workload for educators while maintaining robust assessment practices.
What intrigued me most is the model's ability to parse complex textual data and accurately interpret the depth of students' understanding. Instead of relying solely on keywords or surface-level analysis, the model considers the context and structure of responses. This human-like processing capacity is particularly promising for subjects requiring critical thinking and analysis, such as science and the humanities.
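To see why moving beyond keyword spotting matters, here is a toy illustration of my own, not Somers et al.'s model: even a simple bag-of-words similarity to a reference explanation rewards a fuller answer over mere keyword repetition, and the actual NLP models go much further by modeling context and sentence structure.

```python
import math
from collections import Counter

def cosine(a: str, b: str) -> float:
    """Bag-of-words cosine similarity between two texts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(v * v for v in va.values()))
            * math.sqrt(sum(v * v for v in vb.values())))
    return dot / norm if norm else 0.0

reference = "evaporation happens when water molecules gain enough energy to escape the liquid"
good_answer = "water molecules escape the liquid once they gain enough energy"
keyword_only = "evaporation evaporation evaporation"

# The substantive answer scores higher than pure keyword repetition,
# even though it never uses the word "evaporation".
print(cosine(reference, good_answer) > cosine(reference, keyword_only))  # → True
```

Of course, this toy measure would still stumble on paraphrase, idiom, and culturally specific phrasing, which is precisely where my reservations below come in.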
However, I am skeptical about the model's ability to fully capture nuanced and context-specific responses. While NLP is advanced, it still struggles with interpreting idiomatic expressions or culturally specific references. This limitation raises concerns about its application in diverse classrooms, where language variation may lead to inaccurate assessments.
I believe that further enhancement of these models should focus on increasing cultural and linguistic adaptability. Developing a database that includes diverse linguistic patterns could make the tool more universally applicable. As someone interested in using technology to support diverse learners, I see this as an important next step.
Article 3: Faculty Readiness for a Digital Education Model: A Self-Assessment from Health Sciences Educators
Olivares et al. (2021) examine health sciences educators’ readiness for adopting a digital education model. This topic resonates with me because it highlights the gap between technological innovation and practical application in higher education. The study revealed that while faculty members generally recognize the importance of digital education, their preparedness varies significantly, particularly regarding technical skills and pedagogical integration.
One point that I strongly agree with is the need for targeted professional development. As the authors point out, many educators feel overwhelmed by digital tools due to inadequate training. This reflects my own experience: training sessions were often too generic, failing to address discipline-specific challenges.
A surprising finding is the apparent disconnect between perceived and actual readiness. Many faculty members rate their digital competencies higher than objective measures indicate. This gap suggests a need for more accurate self-assessment tools and for peer mentoring to foster genuine skill development.
What I find intriguing is the emphasis on continuous professional development rather than one-time training sessions. This aligns with the reality that digital education is continuously evolving, requiring educators to stay updated with emerging technologies. I would advocate for integrating professional learning communities where faculty can collaboratively explore digital tools and share best practices.
Reflections and Future Directions
These three studies collectively highlight the transformative potential of emerging technologies in education. While leveraging data analytics and NLP can significantly enhance student assessment, successful implementation hinges on faculty readiness and continuous professional development. I am particularly drawn to the idea of integrating AI-driven assessment tools with ongoing faculty training to bridge the gap between innovation and practice.
In my future educational practice, I plan to explore how to simplify the use of semantic web technologies and NLP tools, making them more accessible to instructors who may lack technical expertise. Additionally, I want to investigate the development of adaptive training programs that address specific challenges identified in readiness assessments.
One open question remains: How can we ensure that these technologies are not only accurate but also equitable in diverse educational settings? I believe future research should focus on developing culturally responsive algorithms and exploring ways to support educators in culturally diverse contexts.
References

Halimi, K., & Seridi-Bouchelaghem, H. (2021). Students’ competencies discovery and assessment using learning analytics and semantic web. Australasian Journal of Educational Technology, 37(5), 77–97.

Somers, R., Cunningham-Nelson, S., & Boles, W. (2021). Applying natural language processing to automatically assess student conceptual understanding from textual responses. Australasian Journal of Educational Technology, 37(5), 98–115.

Olivares, S. L., Lopez, M., Martinez, R., Alvarez, J. P. N., & Valdez-García, J. E. (2021). Faculty readiness for a digital education model: A self-assessment from health sciences educators. Australasian Journal of Educational Technology, 37(5), 116–127.
Fidelis, I appreciate your thoughts and insights from the readings! I had an a-ha moment when you wrote this: "The study revealed that while faculty members generally recognize the importance of digital education, their preparedness varies significantly, particularly regarding technical skills and pedagogical integration." I'm sure many of us in the IST program can relate, but I have been a little surprised by how some professors are adept with technological tools, giving us space to explore and create, while others still require dated reading materials and basic Canvas discussion boards. It is surprising to me that in a program designed to teach us to effectively use technology for learning, there is such significant variance in professor ability.
I just want to say THANK YOU for your thoughtful comments and feedback to my ongoing blog posts. I've appreciated learning from and with you this semester. All the best!