Poster Presentation Time: 1225-1400; 1500-1600
Venue: H2, Tai Po-Shek-O Room, Lower Level I
Presenter(s)
– Dr Pauli LAI, Lecturer, Department of Electrical and Electronic Engineering, The Hong Kong Polytechnic University
– Dr Julia CHEN, Director, Educational Development Centre, The Hong Kong Polytechnic University
Abstract
A common assessment in university is the oral presentation, and students are often required to deliver presentations in English. Two challenges arise. First, when preparing for the assessment, many students focus mainly on the discipline content rather than on communication or their use of English. Second, lecturers of large classes (e.g. around 200 engineering students in one course) hardly have time to give each student feedback on the English communication aspects of their oral presentations. A baseline survey revealed that students need assistance with presentation skills and that both students and discipline teachers would welcome AI-generated feedback. To address these needs, a team of educators from PolyU and BU with expertise in language and AI technology collaboratively developed an online English oral presentation platform called SmartPresenter. SmartPresenter provides students with presentation tips, learning materials, and extensive AI-generated feedback on the communication-related aspects of delivering oral presentations in English, including eye contact, facial expressions, vocal fillers, pronunciation, and fluency. This presentation describes the development and features of SmartPresenter, together with evaluation results on the platform’s effectiveness in facilitating independent practice of English oral presentations and assisting teachers in grading presentation assessments.
Theme: 1. Showcase Project Achievements
Sub-theme: 1.3 Special UGC Grant for Strategic Development of Virtual Teaching and Learning (VTL)
Poster Presentation Time: 1500-1600; 1700-1800
Venue: L6, Tai Po-Shek-O Room, Lower Level I
Presenter(s)
– Professor Gladys Wai Lan TANG, Centre Director, Centre for Sign Linguistics and Deaf Studies, The Chinese University of Hong Kong
– Mr Jafi YF LEE, Research Associate, Centre for Sign Linguistics and Deaf Studies, The Chinese University of Hong Kong
– Dr Chris KM YIU, Senior Programme Officer, Centre for Sign Linguistics and Deaf Studies, The Chinese University of Hong Kong
Abstract
The Centre for Sign Linguistics and Deaf Studies is building a Community of Practice to support deaf and hard-of-hearing (d/hh) students pursuing tertiary education. Because of their diverse backgrounds and learning needs, d/hh students face different barriers to information accessibility. Possible types of educational support, including captioning and AI summaries, video subtitles, note-taking/stenography, wireless transmission systems, sign interpretation, and other accommodations, should be explored to address their respective needs. The project will lead to 1) an improved understanding of the learning needs of d/hh students, 2) a raised awareness of the physical learning environment and hardware accommodations, 3) the development of new teaching strategies and practices, 4) the deployment of new tools and aids, and 5) the design of a support system with accommodations for in-class and course-end assessments. The project will host seminars to disseminate effective strategies for supporting d/hh students among members of the CoP. The ultimate goal is to recommend an effective and operable support system to EDB and UGC.
Theme: 2. Thematic Exploration
Sub-theme: 2.1 Community of Practice (CoP)
Poster Presentation Time: 1225-1400; 1500-1600
Venue: H4, Tai Po-Shek-O Room, Lower Level I
Presenter(s)
– Mr Isaac Ka Chun WAN, Instructional Designer, Centre for Education Innovation, The Hong Kong University of Science and Technology
Abstract
The use of videos in asynchronous learning significantly enhances the educational experience, especially for intricate or abstract concepts. Videos allow students to adapt their learning pace, fostering a more flexible and personalized process. However, traditional video lectures often promote passive learning, making it hard for instructors to monitor students’ progress effectively. To address these challenges and help instructors create an interactive video-based learning environment, a strategic workflow has been developed. This workflow incorporates two customized digital tools that facilitate the creation of engaging video elements and provide detailed analytics on student engagement and progress. As a result, students are empowered in their asynchronous learning journey.
Theme: 1. Showcase Project Achievements
Sub-theme: 1.3 Special UGC Grant for Strategic Development of Virtual Teaching and Learning (VTL)
Poster Presentation Time: 1500-1600; 1700-1800
Venue: G1, Tai Po-Shek-O Room, Lower Level I
Presenter(s)
– Dr Yuk Ming TANG, Senior Lecturer, Department of Industrial and Systems Engineering, The Hong Kong Polytechnic University
Abstract
STEM education is essential in today’s curriculum, even for university students. However, traditional classroom-based instruction methods often lack the interactivity and tailored experiences that foster student engagement and comprehension. The integration of Virtual Reality (VR) and Artificial Intelligence (AI) generative chatbots has emerged as a transformative influence on the teaching and learning process. Despite this, limited research has explored the impact of such advanced technology on STEM learning outcomes. This study explores the potential of employing VR and AI as teaching tools to enhance students’ learning outcomes. It involves 120 university students and examines differences in learning outcomes across three instructional approaches to learning projectile motion: (1) a traditional didactic classroom, (2) a game-based VR metaverse, and (3) a game-based VR metaverse enriched with a generative chatbot-based pedagogical agent. The study evaluates changes in student motivation, cognitive benefit, and learning outcomes. Preliminary findings suggest that incorporating VR and AI into teaching considerably enhances student engagement and cognitive participation. This study demonstrates how the integration of VR with AI can elevate student engagement, comprehension, and skill acquisition in STEM education, paving the way for a more captivating and effective learning environment in the Edu-metaverse.