ILDE content created during the OU workshop


Brainstorm 2: learning outcomes and outputs (Science) (Factors and concerns). Started on 24 October 2013 by Jonathan Martyn; latest revision on 25 October 2013 by Jonathan Martyn (2 revisions).


Morning brainstorming: barriers and solutions (Science) (Factors and concerns). Started on 24 October 2013 by Jonathan Martyn; latest revision on 25 October 2013 by Jonathan Martyn (3 revisions).


NOTE: This Support Document was attached to all of the following Heuristic Evaluation entries.

Support document

This work by Yishay Mor is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.

Heuristic evaluation has been recognised as a powerful technique for evaluating learning design. However, in order to adapt it to this use, the designer/researcher needs to define a protocol and a set of heuristics. In a heuristic evaluation, a group of experts (usually 5-7) is asked to walk through the evaluated system as if they were users (learners) engaged in a typical activity. The experts are presented with a set of design heuristics - rules of thumb against which they are asked to assess their experience. Often they are provided with a score sheet, where they are asked to note any violation of these heuristics and rate its severity.

This template is designed to simplify the process of conducting a heuristic evaluation. You should use it to produce the evaluation protocol you will present to your experts. Create a document from this template and edit it, replacing the guidance notes with your own text. The final product should be sufficient to guide and support an expert in evaluating your project. Start by editing the following paragraph to introduce the concept of heuristic evaluation to your experts, work through the document, and end by deleting these two paragraphs.

Your task

Here is an example of evaluator instructions from the nquire Moon Rock Demonstrator project, developed for the Wolfson OpenScience Laboratory:

You'll play the role of a member of the general public who: (A) has an interest in science; (B) has no previous experience of nquire or of inquiry learning; (C) has no expertise in geology. The activity is not supervised; therefore, the system should be sufficient to guide you through the inquiry. The experimenter will only provide guidance when the evaluator needs help to complete the process. The evaluator must complete the following process:
1. Access the website: http://pontos.open.ac.uk/nquire/latest
2. Create a new user and log in.
3. Complete the Moon rock comparison inquiry.
Some activities have links to external sites; these are not evaluated here.
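As a purely illustrative aside (not part of the support document or of any template), the protocol above - a handful of evaluators each rating violations of a shared set of heuristics on a severity scale - can be sketched in a few lines of code. The heuristic names and ratings below are hypothetical examples, and averaging severities is just one possible way of collating the score sheets.

```python
from statistics import mean

# Hypothetical heuristics and ratings, for illustration only.
# Severity scale as in the score sheets below: 0 = not a problem, 5 = catastrophic.
heuristics = ["Feedback", "Everyday language", "Consistency"]

# One dict per evaluator; the protocol suggests recruiting 5-7 experts.
ratings = [
    {"Feedback": 1, "Everyday language": 0, "Consistency": 3},
    {"Feedback": 2, "Everyday language": 1, "Consistency": 4},
    {"Feedback": 0, "Everyday language": 0, "Consistency": 5},
]

# Average severity per heuristic, worst first, to prioritise redesign effort.
summary = sorted(
    ((h, mean(r[h] for r in ratings)) for h in heuristics),
    key=lambda pair: pair[1],
    reverse=True,
)
for heuristic, severity in summary:
    print(f"{heuristic}: mean severity {severity:.1f}")
```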

Heuristics

Below are four exemplar sets of heuristics. While the overall emphasis is on usability, they all consider factors of learning design and content design as well. There is considerable overlap between these sets, and yet each one includes some unique heuristics. It is also worth considering the different ways of representing the heuristics to the evaluators. You can use one of these sets, pick and choose from them, or develop your own heuristics.

Beale and Sharples (2002) (used by the nquire Moon Rock Demonstrator project, developed for the Wolfson OpenScience Laboratory)

Feedback: inform the user about what is going on using appropriate feedback in a timely manner.
Everyday language: use simple language, avoid technical terms, follow real-world conventions to make things appear logical.
Undo: people make mistakes, so it should be easy to recover to a sensible point.
Consistency: doing similar things in similar places should have similar effects. Also, support the conventions of the specific types of computer and operating systems, such as Windows or MacOS.
Recognition not recall: make next steps and critical information visible and memorable. Allow people to recognise what they should do next, not remember what it is.
Simple design: keep things crisp and simple, to minimise the information presented to the user. Make the design aesthetically pleasing to the target audience.
Expert use: provide accelerators (keyboard shortcuts and advanced techniques) that allow experts to work faster.
Error recovery: try to design the system to prevent errors occurring, and when they do occur, provide clear messages and suggest appropriate solutions.
Documentation: it is best to design a system that requires no documentation, but complex features or very different systems may need it. It should be well organised (searchable and well structured), focussed on the task of the user, simple to follow with concrete steps, and concise. It should ideally be available on the system so it is accessible when needed.

Ssemugabi & de Villiers (2010) Category 1: General interface usability criteria (based on Nielsen s heuristics, modified for e-learning context) 1 Visibility of system status The website keeps the user informed through constructive, appropriate and timely feedback. The system responds to user-initiated actions. There are no surprise actions by the site or tedious data entry sequences. 2 Match between the system and the real world i.e. match between designer model and user model Language usage in terms of phrases, symbols, and concepts is similar to that of users in their day-to-day environment. Metaphor usage corresponds to real-world objects/concepts, e.g. understandable and meaningful symbolic representations are used to ensure that the symbols, icons and names are intuitive within the context of the task performed. Information is arranged in a natural and logical order. 3 Learner control and freedom Users control the system. Users can exit the system at any time, even when they have made mistakes. There are facilities for Undo and Redo. 4 Consistency and adherence to standards The same concepts, words, symbols, situations, or actions refer to the same thing. Common platform standards are followed. 5 Error prevention, in particular, prevention of peripheral usability-related errors The system is designed in such a way that the users cannot easily make serious errors. When a user makes an error, the application gives an appropriate error message. 6 Recognition rather than recall Objects to be manipulated, options for selection, and actions to be taken are visible. The user does not need to recall information from one part of a dialogue to another. Instructions on how to use the system are visible or easily retrievable whenever appropriate. Displays are simple and multiple page displays are minimised. 7 Flexibility and efficiency of use The site caters for different levels of users, from novice to expert. Shortcuts or accelerators, unseen by novice users, are provided to speed up interaction and task completion by frequent users. The system is flexible to enable users to adjust settings to suit themselves, i.e. to customise the system. D4.1 Pilot workshops and enactments Design Package 9

8 Aesthetics and minimalism in design Site dialogues do not contain irrelevant or rarely needed information, which could distract users. 9 Recognition, diagnosis, and recovery from errors Error messages are expressed in plain language. Error messages define problems precisely and give quick, simple, constructive, specific instructions for recovery. If a typed command results in an error, users need not retype the entire command, but only the faulty part. 10 Help and documentation The site has a help facility and other documentation to support users needs. Information in these facilities is easy to search, task-focused, and lists concrete steps to accomplish a task. Category 2: Website-specific criteria 11 Simplicity of site navigation, organisation and structure The site has a simple navigational structure. Users should know where they are and have the option to select where to go next, e.g. via a site map or breadcrumbs. The navigational options are limited, so as not to overwhelm the user. Related information is placed together. Information is organised hierarchically, moving from the general to the specific. Common browser standards are followed. Each page has the required navigation buttons or hyperlinks (links), such as previous (back) next and home. 12 Relevance of site content to the learner and the learning process Content is engaging, relevant, appropriate and clear to learners using the WBL site. The material has no biases such as racial and gender biases, which may be deemed offensive. It is clear which materials are copyrighted and which are not. The authors of the content are of reputable authority. Category 3: Educational criteria: Learner-centred instructional design, grounded in learning theory 13 Clarity of goals, objectives and outcomes There are clear goals, objectives and outcomes for learning encounters. The reason for inclusion of each page or document on the site is clear. 14 Effectiveness of collaborative learning (where such is available) Facilities and activities are available that encourage learner-learner and learner-teacher interactions. Facilities are provided for both asynchronous and synchronous communication, such as e-mail, discussion forums and chat rooms. D4.1 Pilot workshops and enactments Design Package 10

15 Level of learner control Apart from controlling the interactions with the site, learners have some freedom to direct their learning, either individually or collaboratively, and to have a sense of ownership of it. Learners are given some control of the content they learn, how it is learned, and the sequence of units. Individual learners can customise the site to suit their personal learning strategies. Educators can customise learning artefacts to the individual learner, for example, tests and performance evaluations can be customised to the learner s ability. Where appropriate, learners take the initiative regarding the methods, time, place, content, and sequence of learning. 16 Support for personally significant approaches to learning There are multiple representations and varying views of learning artefacts and tasks. The site supports different strategies for learning and indicates clearly which styles it supports. The site is used in combination with other mediums of instruction to support learning. Metacognition (the ability of a learner to plan, monitor and evaluate his/her own cognitive skills) is encouraged. Learning activities are scaffolded by learner support and by optional additional information. 17 Cognitive error recognition, diagnosis and recovery Cognitive conflict, bridging and problem-based learning strategies are used in the recognition-diagnosis-recovery cycle. Learners have access to a rich and complex environment in which they can explore different solutions to problems. Learners are permitted to learn by their mistakes and are provided with help to recover from cognitive errors. 18 Feedback, guidance and assessment Apart from the system s interface-feedback by the system, considered under Criterion 1, learners give and receive prompt and frequent feedback about their activities and the knowledge being constructed. Learners are guided as they perform tasks. Quantitative feedback, e.g. grading of learners activities, is given, so that learners are aware of their level of performance. 19 Context meaningful to domain and learner Knowledge is presented within a meaningful and authentic context that supports effective learning. Authentic, contextualised tasks are undertaken rather than abstract instruction. The application enables context- and content-dependent knowledge construction. Learning occurs in a context of use so that knowledge and skills are transferable to similar contexts. The representations are understandable and meaningful, ensuring that symbols, icons and names used are intuitive within the context of the learning task. 20 Learner motivation, creativity and active learning The site has content and interactive features that attract, motivate and retain learners, and that promote creativity, e.g. the online activities are situated in real-world practice, and interest and engage the learners. D4.1 Pilot workshops and enactments Design Package 11

To promote active learning and critical thinking, tasks require learners to compare, analyse and classify information, and to make deductions.

Albion (1999)

Interface design heuristics [after Nielsen (1994)]
Ensures visibility of system status: The software keeps the user informed about what is going on through appropriate and timely feedback.
Maximises match between the system and the real world: The software speaks the users' language rather than jargon. Information appears in a natural and logical order.
Maximises user control and freedom: Users are able to exit locations and undo mistakes.
Maximises consistency and matches standards: Users do not have to wonder whether different words, situations or actions mean the same thing. Common operating system standards are followed.
Prevents errors: The design provides guidance which reduces the risk of user errors.
Supports recognition rather than recall: Objects, actions and options are visible. The user does not have to rely on memory. Information is visible or easily accessed whenever appropriate.
Supports flexibility and efficiency of use: The software allows experienced users to use shortcuts and adjust settings to suit.
Uses aesthetic and minimalist design: The software provides an appealing overall design and does not display irrelevant or infrequently used information.
Helps users recognise, diagnose and recover from errors: Error messages are expressed in plain language, clearly indicate the problem and recommend a solution.
Provides help and documentation: The software provides appropriate online help and documentation which is easily accessed and related to the users' needs.

Educational design heuristics [after Quinn (1996)]
Clear goals and objectives: The software makes it clear to the learner what is to be accomplished and what will be gained from its use.
Context meaningful to domain and learner: The activities in the software are situated in practice and will interest and engage a learner.
Content clearly and multiply represented and multiply navigable: The message in the software is unambiguous. The software supports learner preferences for different access pathways. The learner is able to find relevant information while engaged in an activity.
Activities scaffolded: The software provides support for learner activities to allow working within existing competence while encountering meaningful chunks of knowledge.
Elicit learner understandings: The software requires learners to articulate their conceptual understandings as the basis for feedback.
Formative evaluation: The software provides learners with constructive feedback on their endeavours.
Performance should be 'criteria-referenced': The software will produce clear and measurable outcomes that would support competency-based evaluation.
Support for transference and acquiring 'self-learning' skills: The software supports transference of skills beyond the learning environment and will facilitate the learner becoming able to self-improve.
Support for collaborative learning: The software provides opportunities and support for learning through interaction with others through discussion or other collaborative activities.

Content heuristics
Establishment of context: The photographs, documents and other materials related to the simulated schools create a sense of immersion in a simulated reality.
Relevance to professional practice: The problem scenarios and included tasks are realistic and relevant to the professional practice of teachers.
Representation of professional responses to issues: The sample solutions represent a realistic range of teacher responses to the issues and challenge users to consider alternative approaches.
Relevance of reference materials: The reference materials included in the package are relevant to the problem scenarios and are at a level appropriate to the users.
Presentation of video resources: The video clips of teacher interviews and class activities are relevant and readily accessible to the user.
Assistance is supportive rather than prescriptive: The contextual help supports the user in locating relevant resources and dealing with the scenarios without restricting the scope of individual responses.
Materials are engaging: The presentation style and content of the software encourages a user to continue working through the scenarios.
Presentation of resources: The software presents useful resources for teacher professional development in an interesting and accessible manner.
Overall effectiveness of materials: The materials are likely to be effective in increasing teachers' confidence and capacity for integrating information technology into teaching and learning.

Benson et al (2001)

1. Visibility of system status: The e-learning program keeps the learner informed about what is happening, through appropriate feedback within reasonable time. Sample questions to ask yourself:
a. Does the learner know where they are at all times, how they got there, and how to get back to the point from which they started?
b. When modules and other components of the e-learning (e.g., streaming video) are loading, is the status of the upload communicated clearly?
c. Does the learner have confidence that the e-learning program is operating the way it was designed to operate?

2. Match between system and the real world: The e-learning program's interface employs words, phrases and concepts familiar to the learner, rather than system-oriented terms. Wherever possible, the e-learning program utilizes real-world conventions that make information appear in a natural and logical order. Sample questions to ask yourself:
a. Does the e-learning program's navigation and interactive design utilize metaphors that are familiar to the learner, either in terms of traditional learning environments (e.g., lectures, quizzes, etc.) or in terms related to the specific content of the program?
b. Is the cognitive load of the interface as low as possible to enable learners to engage with the content, tasks, and problems as quickly as possible?
c. Does the e-learning program adhere to good principles of human information processing?
3. User control and freedom: The e-learning program allows the learner to recover from input mistakes and provides a clearly marked "emergency exit" to leave an unwanted state without having to go through an extended dialogue. Sample questions to ask yourself:
a. Does the e-learning program allow the learner to move around in the program in an unambiguous manner, including the capability to go back and review previous sections?
b. Does the e-learning program allow the learner to leave whenever desired, but easily return to the closest logical point in the program?
c. Does the e-learning program distinguish between input errors and cognitive errors, allowing easy recovery from the former always, and from the latter when it is pedagogically appropriate?
4. Consistency and standards: The e-learning program is consistent in its use of different words, situations, or actions and it adheres to general software and platform conventions. Sample questions to ask yourself:
a. Does the e-learning program function properly as long as the computer's screen resolution, memory allocations, bandwidth, browsers, plug-ins, and other technical aspects meet the required specifications?
b. Does the e-learning program include interactions that are counter-intuitive with respect to common software conventions?
c. Does the e-learning product adhere to widely recognized standards for interactions (e.g., going back in a Web browser)?
5. Error prevention: The e-learning program is carefully designed to prevent common problems from occurring in the first place. Sample questions to ask yourself:
a. Is the e-learning program designed so that the learner recognizes when he/she has made a mistake related to input rather than content?
b. Is the e-learning program designed to take advantage of screen design conventions and guidelines that clarify meaning?
c. Is the e-learning program designed to provide a second chance when unexpected input is received (e.g., "you typed 'bat' in response to the question. Did you mean 'tab'?")?
6. Recognition rather than recall: The e-learning program makes objects, actions, and options visible so that the user does not have to remember information from one part of the program to another. Instructions for use of the program are visible or easily retrievable. Sample questions to ask yourself:

a. Does the interface of the e-learning program speak for itself, so that extensive consultation of a manual or other documentation does not interfere with learning?
b. Are icons and other screen elements designed so that they are as intuitive as possible?
c. Does the e-learning program provide user-friendly hints and/or clear directions when the learner requests assistance?
7. Flexibility and efficiency of use: The e-learning program is designed to speed up interactions for the experienced learner, but also cater to the needs of the inexperienced learner. Sample questions to ask yourself:
a. Is the e-learning program designed to make the best use of useful graphics and other media elements that download as quickly as possible?
b. Is the e-learning program designed to allow large media files to be downloaded in advance so that learner wait time is minimized?
c. Does the program allow keyboard shortcuts that make frequent interactions as efficient as possible?
8. Aesthetic and minimalist design: Screen displays do not contain information that is irrelevant, and "bells and whistles" are not gratuitously added to the e-learning program. Sample questions to ask yourself:
a. Are the font choices, colors, and sizes consistent with good screen design recommendations for e-learning programs?
b. Are extra media features (e.g., streaming video) in the e-learning program supportive of learning, motivation, content, or other goals?
c. Does the e-learning program utilize white space and other screen design conventions appropriately?
9. Help users recognize, diagnose, and recover from errors: The e-learning program expresses error messages in plain language (without programmer codes), precisely indicates the problem, and constructively suggests a solution. Sample questions to ask yourself:
a. Does the learner receive meaningful feedback concerning the nature of any input they make into the program?
b. If the learner answers a question incorrectly, is he/she told the correct answer and why the answer given was wrong, if this is instructionally appropriate?
c. When feedback is provided, is it given in a clear, direct, and friendly (non-condescending) manner?
10. Help and documentation: When it is absolutely necessary to provide help and documentation, the e-learning program provides any such information in a manner that is easy to search. Any help provided is focused on the learner's task, lists concrete steps to be carried out, and is not too large. Sample questions to ask yourself:
a. Is help provided that is screen or context specific?
b. Is help and documentation available from any logical part of the e-learning program?
c. Does the e-learning program include a map or table of contents that allows you to see what you have seen and not seen?
11. Interactivity: The e-learning program provides content-related interactions and tasks that support meaningful learning. Sample questions to ask yourself:

a. Does the e-learning program provide too many long sections of text to read without meaningful interactions?
b. Does the e-learning program engage the learner in content-specific tasks to complete and problems to solve that take advantage of the state-of-the-art of e-learning design?
c. Does the e-learning program provide a level of experiential learning congruent with the content and capabilities of the target audience?
12. Message Design: The e-learning program presents information in accord with sound principles of information-processing theory. Sample questions to ask yourself:
a. Is the most important information on the screen placed in the areas most likely to attract the learner's attention?
b. Does the e-learning program follow good information presentation guidelines with respect to organization and layout?
c. Are graphics in the e-learning program used to clarify content, motivate, or serve other pedagogical goals?
13. Learning Design: The interactions in the e-learning program have been designed in accord with sound principles of learning theory. Sample questions to ask yourself:
a. Does the e-learning program provide for instructional interactions that reflect sound learning theory?
b. Does the e-learning program engage learners in tasks that are closely aligned with the learning goals and objectives?
c. Does the e-learning program inform learners of the objectives of the program and remind them of prior learning?
14. Assessment: The e-learning program provides assessment opportunities that are aligned with the program objectives and content. Sample questions to ask yourself:
a. Does the e-learning program provide opportunities for self-assessments that advance learner achievement?
b. If appropriate to the context, do assessments provide sufficient feedback to the learner to provide remedial directions?
c. Are higher order assessments (e.g., analysis, synthesis, and evaluation) provided wherever appropriate, rather than lower order assessments (e.g., recall and recognition)?
15. Media Integration: The inclusion of media in the e-learning program serves clear pedagogical and/or motivational purposes. Sample questions to ask yourself:
a. Is media included that is obviously superfluous, i.e., lacking a strong connection to the objectives and design of the program?
b. Is the most appropriate media selected to match message design guidelines or to support specific instructional design principles?
c. If appropriate to the context, are various forms of media included for remediation and/or enrichment?
16. Resources: The e-learning program provides access to all the resources necessary to support effective learning. Sample questions to ask yourself:
a. Does the e-learning program provide access to a range of resources (e.g., examples or real data archives) appropriate to the learning context?
b. If the e-learning program includes links to external World Wide Web or Intranet resources, are the links kept up-to-date?

c. Are resources provided in a manner that replicates as closely as possible their availability and use in the real world?
17. Performance Support Tools: The e-learning program provides access to performance support tools that are relevant to the content and objectives. Sample questions to ask yourself:
a. Are performance support tools provided that mimic their access in the real world?
b. Provided the context is appropriate, does the e-learning program provide sufficient search capabilities?
c. Provided the context is appropriate, does the e-learning program provide access to peers, experts, instructors, and other human resources?
18. Learning Management: The e-learning program enables learners to monitor their progress through the material. Sample questions to ask yourself:
a. Does the learner know what he/she is doing and how he/she is doing within various parts of the e-learning program?
b. Does the learner perceive options for additional guidance, instruction, or other forms of assistance when it is needed?
c. Does the learner possess an adequate understanding of what he/she has completed and what remains to be done within any specific unit (e.g., a course) of e-learning?
19. Feedback: The e-learning program provides feedback that is contextual and relevant to the problem or task in which the learner is engaged. Sample questions to ask yourself:
a. Is the feedback given at any specific time tailored to the content being studied, problem being solved, or task being completed by the learner?
b. Does feedback provide the learner with information concerning his/her current level of achievement within the program?
c. Does the e-learning program provide learners with opportunities to access extended feedback from instructors, experts, peers, or others through e-mail or other Internet communications?
20. Content: The content of the e-learning program is organized in a manner that is clear to the learner. Sample questions to ask yourself:
a. Is the content organized in manageable modules or other types of units?
b. Is the content broken into appropriate chunks so that learners can process them without too much cognitive load?
c. Does the e-learning program provide advanced organizers, summaries, and other components that foster more efficient and effective learning?

References Beale, R. & Sharples, M. (2002), 'Design guide for developers of educational software', British Educational Communications and Technology Agency.www.eee.bham.ac.uk/sharplem/Papers/Design%20Guide.pdf Ssemugabi, S. & de Villiers, R. (2010), 'Effectiveness of heuristic evaluation in usability evaluation of e-learning applications in higher education', South African Computer Journal 45 (0). http://sacj.cs.uct.ac.za/index.php/sacj/article/view/37 Hagen, P.; Robertson, T.; Kan, M. & Sadler, K. (2005), 'Emerging research methods for understanding mobile technology use' 'Proceedings of the 17th Australia conference on Computer-Human Interaction: Citizens Online: Considerations for Today and the Future', Computer-Human Interaction Special Interest Group (CHISIG) of Australia, 1-10. research.it.uts.edu.au/idhup/wordpress/wp-content/uploads/2009/10/hagen_ozchi2005.pdf.pdf Kjeldskov, J.; Graham, C.; Pedell, S.; Vetere, F.; Howard, S.; Balbo, S. & Davies, J. (2005), 'Evaluating the usability of a mobile guide: The influence of location, participants and resources',behaviour and Information Technology 24 (1), 51-66. disweb.dis.unimelb.edu.au/staff/showard/papers/bit2005.pdf Reeves, T. C.; Benson, L.; Elliott, D.; Grant, M.; Holschuh, D.; Kim, B.; Kim, H.; Lauber, E. & Loh, C. S. (2002), Usability and Instructional Design Heuristics for E-Learning Evaluation, in P. Barker & S. Rebelsky, ed., in 'Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2002 (pp. 1615-1621).', AACE, Chesapeake, VA. http://www.csloh.com/research/pdf/edmedia2002.pdf,http://treeves.coe.uga.edu//edit8350/heipep.html Albion, P. (1999), Heuristic evaluation of educational multimedia: from theory to practice, in 'Proceedings ASCILITE 1999: 16th Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education: Responding to Diversity', pp. 9--15. Nielsen, J. (1994), Heuristic evaluation, John Wiley & Sons, Inc., New York, NY, USA, pp. 25-62. Nielsen, J. (1994), 'How to conduct a heuristic evaluation'. http://www.useit.com/papers/heuristic/heuristic_evaluation.html Nielsen, J. & Molich, R. (1990), Heuristic evaluation of user interfaces, in 'CHI '90: Proceedings of the SIGCHI conference on Human factors in computing systems', ACM Press, New York, NY, pp. 249-256 D4.1 Pilot workshops and enactments Design Package 18

FBL heuristics (Heuristic Evaluation). Started on 24 October 2013 by Martin Friel; latest revision on 24 October 2013 by Martin Friel (2 revisions).

FBL heuristics

Heuristic Evaluation of e-learning: FBL collaborative activity

See the support document for instructions for using this template. Heuristic evaluation is a technique borrowed from usability research, where a group of experts is asked to assess a particular design using a given rubric (set of heuristics). It offers a low-cost, rapid evaluation which often uncovers design flaws at an early stage.

Thank you for agreeing to help us evaluate FBL collaborative activity. Today you will be asked to put yourself in the learner's seat, review the project, and note any design flaws you perceive.

Introducing FBL collaborative activity
Add a paragraph or two introducing your LdS: what are you creating, who is the target audience, what do you aim to achieve?

Your task
Instruct the evaluator how to perform the evaluation. Tell them what role they assume, and what actions they should take. Often, the instructions suggest that the evaluator first does a quick pass through the evaluated system, then goes through it systematically and monitors the heuristics (below). You might want to suggest that evaluators take a quick note of the design flaws they notice, then return to their score sheet at the end and fill in the gaps.

The Heuristics
Present the evaluator with a set of heuristics to apply as they review your LdS. Review the heuristics sets in the appendix. Select the ones that are relevant to your LdS, edit them to better fit your context, and add your own.

Heuristics
- Need for synchronous and asynchronous interactions
- Diagnostics (such as a quiz) to get an idea of level/experience/knowledge etc.
- Sharing of formal and informal work experiences
- Thinking through time constraints: time required and different time zones
- Confidence-building activities in initial stages of module
- Developing proxies for work experience

Scoring sheet
Provide evaluators with a scoring sheet to use during the evaluation.

Location: Where was the issue noticed?
Issue: Describe the issue that you noticed.
Heuristic: Which heuristic does it violate?
Severity: How bad is it (0-5)? 0 - not a problem; 5 - catastrophic, show stopper.
Recommended action (optional): Suggest how to rectify this issue.
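If the score sheet is kept electronically rather than on paper, one row of it could be captured as a small record along the lines of the sketch below. This is purely illustrative and not part of the template: the field names simply mirror the columns above, the example values are invented, and the check only enforces the 0-5 severity scale.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScoreSheetRow:
    """One entry in a heuristic evaluation score sheet."""
    location: str                              # Where was the issue noticed?
    issue: str                                 # Description of the issue
    heuristic: str                             # Which heuristic it violates
    severity: int                              # 0 = not a problem ... 5 = catastrophic, show stopper
    recommended_action: Optional[str] = None   # Optional suggestion for rectifying the issue

    def __post_init__(self) -> None:
        # Keep ratings on the 0-5 scale used in the template.
        if not 0 <= self.severity <= 5:
            raise ValueError("severity must be between 0 and 5")

# Hypothetical example row for the FBL collaborative activity:
row = ScoreSheetRow(
    location="Week 2 forum task",
    issue="No indication of how long the activity should take",
    heuristic="Thinking through time constraints",
    severity=2,
    recommended_action="State the expected duration and time-zone implications in the task brief",
)
```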

HSC activity - heuristic evaluation (Heuristic Evaluation). Started on 24 October 2013 by Rebecca Jones (1 revision).

HSC activity - heuristic evaluation

Heuristic Evaluation of e-learning: <LdS>

See the support document for instructions for using this template. Heuristic evaluation is a technique borrowed from usability research, where a group of experts is asked to assess a particular design using a given rubric (set of heuristics). It offers a low-cost, rapid evaluation which often uncovers design flaws at an early stage.

Thank you for agreeing to help us evaluate <project name>. Today you will be asked to put yourself in the learner's seat, review the project, and note any design flaws you perceive.

Introducing <LdS>
Add a paragraph or two introducing your LdS: what are you creating, who is the target audience, what do you aim to achieve?

Your task
Instruct the evaluator how to perform the evaluation. Tell them what role they assume, and what actions they should take. Often, the instructions suggest that the evaluator first does a quick pass through the evaluated system, then goes through it systematically and monitors the heuristics (below). You might want to suggest that evaluators take a quick note of the design flaws they notice, then return to their score sheet at the end and fill in the gaps.

The Heuristics
- Clear benefit to students
- Build skills gradually
- LOs clear
- Timings clear

Scoring sheet
Provide evaluators with a scoring sheet to use during the evaluation.

Location: Where was the issue noticed?
Issue: Describe the issue that you noticed.
Heuristic: Which heuristic does it violate?
Severity: How bad is it (0-5)? 0 - not a problem; 5 - catastrophic, show stopper.
Recommended action (optional): Suggest how to rectify this issue.

Issue: Timings aren't specified enough. Heuristic: Be clear about time needed. Severity: 1. Recommended action: Add this!

MCT heuristics (Heuristic Evaluation) Started on 24 October 2013 by Jane Collinson 1 revision MCT heuristics Heuristic Evaluation of e-learning: <LdS> See the support document for instructions for using this template. Heuristic evaluation is a technique borrowed from usability research, where a group of experts is asked to assess a particular design using a given rubric (set of heuristics). It offers a low-cost rapid evaluation which often uncovers design flaws at an early stage. Thank you for agreeing to help us evaluate <project name>. Today you will be asked to put yourself in the learner s seat, review the project, and note any design flaws you perceive. Introducing <LdS> Add a paragraph or two introducing your LdS: what are you creating, who is the target audience, what do you aim to achieve? Your task Instruct the evaluator how to perform the evaluation. Tell them what role they assume, and what actions they should take. Often, the instructions suggest that the evaluator first does a quick pass through the evaluated system, then goes through it systematically and monitors the heuristics (below). You might want to suggest that evaluators take a quick note of the design flaws they notice, then return to their score sheet at the end and fill in the gaps. The Heuristics Present the evaluator with a set of heuristics to apply as they review your LdS. Review the heuristics sets in the appendix. Select the ones that are relevant to your LdS, edit them to better fit your context, and add your own. Scoring sheet Provide evaluators with a scoring sheet to use during the evaluation. Location Issue Heuristic Severity Recommended action (optional) Where was the issue noticed? Describe the issue that you noticed Which heuristic does it violate? How bad is it (0-5)? 0 - not a problem 5 - catastrophic, show stopper. Suggest how to rectify this issue D4.1 Pilot workshops and enactments Design Package 23


MTC heuristic evaluation (Heuristic Evaluation) Started on 24 October 2013 by Rebecca Galley Latest revision on 24 October 2013 by Rebecca Galley 2 revisions METIS PROJECT MCT heuristics Heuristic Evaluation of e-learning: <LdS> See the support document for instructions for using this template. Heuristic evaluation is a technique borrowed from usability research, where a group of experts is asked to assess a particular design using a given rubric (set of heuristics). It offers a low-cost rapid evaluation which often uncovers design flaws at an early stage. Thank you for agreeing to help us evaluate <project name>. Today you will be asked to put yourself in the learner s seat, review the project, and note any design flaws you perceive. Introducing <LdS> Add a paragraph or two introducing your LdS: what are you creating, who is the target audience, what do you aim to achieve? Your task Instruct the evaluator how to perform the evaluation. Tell them what role they assume, and what actions they should take. Often, the instructions suggest that the evaluator first does a quick pass through the evaluated system, then goes through it systematically and monitors the heuristics (below). You might want to suggest that evaluators take a quick note of the design flaws they notice, then return to their score sheet at the end and fill in the gaps. The Heuristics Present the evaluator with a set of heuristics to apply as they review your LdS. Review the heuristics sets in the appendix. Select the ones that are relevant to your LdS, edit them to better fit your context, and add your own. Scoring sheet Provide evaluators with a scoring sheet to use during the evaluation. Location Issue Heuristic Severity Recommended action (optional) Where was the issue noticed? Describe the issue that you noticed Which heuristic does it violate? How bad is it (0-5)? Suggest how to rectify this issue D4.1 Pilot workshops and enactments Design Package 25

Activity quite subjective How to students know what sort of things to contribute in evaluation of card (level 3) Explaining why something is an improvement is hard to measure Good Tutor understands role and understands the online environment (receives training) Good scaffolding Must be relevant to the learning outcomes Very clear explanation in advance about why they are asked to do it Advance warning about time and duration 0 - not a problem 5 - catastrophic, show stopper. Make sure tutors refer students back to the brief Get students to respond to criteria in the objective Clarify outcomes No changes D4.1 Pilot workshops and enactments Design Package 26

Christmas card (Web Collage) Revision 2 by Jane Collinson on 24 Oct 2013 17:05 Revision 1 by Jane Collinson on 24 Oct 2013 15:58 Christmas card General information: Title: Christmas card Prerrequisites: Debate, exchange, agree: Be able to explain why something is an improvement: Discuss and improve on a design that has been provided: Learning activity flow: Pyramid Level 1 Student Discuss brief and provided design Students to add comments on brief and how the template card provided matches it - or doesn't. Forum discussion of comments re brief and card Teacher Introduction and objectives Clear guidelines. Time to take. Provide incomplete design Incomplete Christmas card Provide brief for card Level 2 group Teacher Level 2 Use forum comments in redesign of card Students to produce another version of card using comments and also module content and own research Take photo of new design and upload to ODS Moderates Students design card and take photo Comment on other students' redesigns in ODS Level 3 Class Relate design to brief, research and module content Teacher Short report reflecting on relationship and best fit of design to brief. Assessed? Support activity 3 Agree on and produce final design D4.1 Pilot workshops and enactments Design Package 27

Discuss, collaborate, decide
Submit all teams' selected designs to 'expert' for independent choice of final design. Plus a motivation, as 'real world'.
Assessment plan:

FBL collaborative activity (Web Collage) Revision 1 by Martin Friel on 24 Oct 2013 15:31 FBL collaborative activity General information: Title: FBL collaborative activity Prerrequisites: Know how to analyse a workplace issue using a variety of theoretical perspectives: Learning activity flow: Pyramid Level 1 Student Individual reading of case study and your allocated theory Teacher Support activity 1 Guide students to resources (which include allocations) and welcome queries on case study/theories/process. Level 2 Level 2 group Share analysis of case study in small groups and write agreed analysis * Add individual outputs to forum * Read other group members' contributions * Make comments on other contributions * Compile ideas to form group response Teacher Monitoring Tutor intervenes if students can't resolve differences or go way off track, explaining reasons. Level 3 Level 3 group Teacher Students recombined in sub-groups to argue for their technique Support activity 3 Level 4 Class Teacher Level 4 Support activity 4 Assessment plan: D4.1 Pilot workshops and enactments Design Package 29

FBL collaborative activity revised (Web Collage) Revision 1 by Martin Friel on 24 Oct 2013 15:57 METIS PROJECT FBL collaborative activity revised General information: Title: FBL collaborative activity revised Prerrequisites: Learning activity flow: Jigsaw Individual phase Individual reading work Teacher Individual reading Support individual study Individual reading of case study and your allocated theory, and application of theory to case study. Expert phase Expert group Teacher Subproblem discussion Discussion support Jigsaw phase Individual reading work Teacher Global discussion Solution proposal Discussion support Assessment plan: D4.1 Pilot workshops and enactments Design Package 30
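As an illustrative aside (not part of the design above), the group formation behind a jigsaw flow like this one can be sketched in code: each student is first allocated one theory, expert groups gather everyone with the same theory, and jigsaw groups then mix one member from each expert group so every theory is represented in the global discussion. The student and theory names below are invented for the example.

```python
from itertools import zip_longest

# Hypothetical cohort and theory list, for illustration only.
students = ["Ana", "Ben", "Chloe", "Dev", "Emma", "Finn", "Grace", "Hugo", "Iris"]
theories = ["Theory A", "Theory B", "Theory C"]

# Individual phase: allocate one theory per student (round robin).
allocation = {s: theories[i % len(theories)] for i, s in enumerate(students)}

# Expert phase: group together everyone who studied the same theory.
expert_groups = {t: [s for s, alloc in allocation.items() if alloc == t] for t in theories}

# Jigsaw phase: each group takes one member from every expert group.
jigsaw_groups = [
    [member for member in mix if member is not None]
    for mix in zip_longest(*expert_groups.values())
]

print(expert_groups)
print(jigsaw_groups)
```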

HSC WebCollage prototype (simulation) (Web Collage) Revision 2 by Beccy Dresden on 24 Oct 2013 15:38 Revision 1 by Beccy Dresden on 24 Oct 2013 15:28 METIS PROJECT HSC WebCollage prototype (simulation) General information: Title: HSC WebCollage prototype (simulation) Prerrequisites: Increase awareness of diversity and challenge limits to knowledge: Share different experiences: Co-create a shared presentation to communicate range of group views on diversity and older people: Learning activity flow: Simulation Role information Students Student-generated case studies Research the media to find real-life examples of older people who are as different as possible to the stimulus material (stereotypes), summarise your findings in 250 words or less and post your summary to the forum. Teacher Support activities (if req'd) Suggest possible resources (online or offline) to students who aren't sure where to look. Offer guidance on searching. Small groups of students Teacher Role definition Choose a character and adopt their persona Debate the question 'what is the most important factor in your quality of life?' while in character. Consolidate findings Support activity Encourage students to make explicit links to theory previously encountered in study material when answering the question. Tutor role Agree key findings from the discussion as a series of bullet points identifying what the characters disagreed about. Ensure that students focus on differences not similarities - we're looking for diversity not consensus! Simulation groups New group Teacher Prepare simulation Support activity 3 Simulation Students Teacher D4.1 Pilot workshops and enactments Design Package 31

Simulation Support activity 4 Assessment plan: D4.1 Pilot workshops and enactments Design Package 32

Intervention and Reframing: Using Diagramatic Thinking (Web Collage) Revision 3 by Eloy Villasclaras Fernandez on 24 Oct 2013 16:10 Revision 2 by Eloy Villasclaras Fernandez on 24 Oct 2013 09:44 Revision 1 by Eloy Villasclaras Fernandez on 24 Oct 2013 09:43 Intervention and Reframing: Using Diagramatic Thinking General information: Title: Intervention and Reframing: Using Diagramatic Thinking Prerrequisites: None Discussion & Collaboration: Encourage discussion and collaboration between students & students and students & teachers Diagrammatic thinking: Prompt students & teachers to engage in diagrammatic thinking (e.g. reasoning through the use of visual representations/diagrams). Reframming: Engage in reframing (e.g. this is where the teacher/workshop presenter initiates an intervention that leads students (and teachers when applicable) to reframing ) Collaborate to construct new solution: Assist students and teachers to use diagrammatic representations to construct new analogical solution/ideas, to be used to solve a problem or task at hand Re-mapping: This means to abstract from a given diagram s elements, and re-apply them onto a problem or task, through the use of analogical reasoning. After the diagram s elements are identified, analogies are built based on these elements. These analogies are fuelled by an effort to map the elements on to the challenge at hand. Learning activity flow: Pyramid Level 1 Level 1 group Discussion: What does the image depict? Students get into pairs (20 student in 10 pairs) and they discuss what they see in the image. They collaborate by listening to each other's ideas and collectively attempting to describe what they see in the image. Teacher Support activity 1 This activity is to foster dialogue and collaboration, Instruct student to get into pairs. Then provide access to the Pyramid image. Ask students: Can you describe what you see What do you think the image is a depiction of? Level 2 group Teacher Level 2 Deeper discussion of the pyramid image Students work together in groups of 4 to answer the 6 questions presented by the teacher about the diagram. They can do this collaboratively on Google Document that also lists the 6 questions. Reframing Students are asked to choose to discuss the new images: the moebius strip, atom or hive, to answer the teacher's Support activity 2 Teacher has the students move from pairs to groups of 4 and presents them with the pyramid questions. Teacher instructs them to quickly discuss (face-to-face) and answer questions using the google document provided. If the students have not already come to some conclusion that this represents society or that an inequity is visible, the teacher can ask: "Does the image represent equality?" "If you were a person in the image, based on 'reality' where do you think you might be on the pyramid? Is that where you want to be?" Each group answers the 6 questions on the same google doc. If a group finishes early, they can read what the other groups are writing, but the teacher D4.1 Pilot workshops and enactments Design Package 33

directions: "Describe a new form of society based on one of the diagrams so that it is more equal." instructs them to stick with their original answers for the time being.
Intervention
The task they are given is to describe a new form of society (or basic political/social structures) based on one of the diagrams and make it more equitable.
Level 3 group / Teacher
Level 3
Students get into slightly bigger groups (2 groups of 6 and 1 group of 8). Discussing the 3 diagrams, students collaborate and choose only one, and discuss why they think it would be good for a new societal structure that is more equal.
Support activity 3: Diagrammatic Thinking
Level 3: Hard choices?
Students collaborate to provide a written rationale for their choice of diagram to reorganise society to make it more equal. Here they are using Analogy: the use of analogy allows them to move from one object (which can be an item, a person, a situation or a concept) to another, on the basis of some usually direct or structural equivalence/similarity, guided by the purposes to which the user is committed (e.g. reorganising society to make it more just). This process has been linked to creativity and creative thinking. Here the students are essentially engaged in a co-creative activity. Together they must debate how the society would be reorganised using the diagram. They must record this collaborative work in a Google Drawing (provided by the teacher based on what image they choose). This is the intervention. It engages students in diagrammatic lateral thinking (DLT). The teacher will circulate and ask students to think carefully about why they are making the choice they are and to come to a decision collectively and through negotiation. Here the diagram, through its use (reframing), serves as a vehicle of cognitive processes, embodying various aspects of a problem (restructuring society). The agent's (students') mind is extended onto the diagram, and reasoning proceeds through structural rather than semantic or syntactical entailment. One therefore thinks through the diagram rather than using it as a simple prop.
Level 4 group / Teacher
Fostering 're-mapping' based on students' reframing
Level 4: Presentation
The three groups of students then present their rationales for the choice of diagram for reorganising society, via the Google Drawing they have collaborated on. They provide their analogies and articulate their choices. Here the teacher is to act as a facilitator and make sure each of the three groups has time to present its rationale. Chances are, there will be more than one image chosen, but in the case of 2 or 3 groups choosing the same image, the teacher needs to facilitate the discussion in a way that builds consensus as to which elements (drawing on the rationales) are most sound and possible... The goal here is to make sure the students are able to remap. This means they are guided to abstract from a given diagram's elements (through the teacher's intervention), and re-apply (reframe) them onto a problem or task (making society more equitable), through the use of analogical reasoning. After the diagram's elements are identified, the teacher helps the students to articulate analogies, and then these are built based on these elements and included in the students' collaborative rationales. Students' analogies are fuelled by an effort to map the elements onto the challenge at hand.
Level 5
Class: Consensus (whole class)
At this point, a whole-class discussion begins where the students must reach consensus, or have strong enough rationales to support more than one societal reorganisation based on the 3 diagrams.
Teacher: Support activity 5: Facilitation of the 're-mapping'
Here the teacher makes sure students have used analogy to build their rationales. The students' use of analogy, stemming primarily from its fluidity in managing equivalence/similarity, illustrates how it can be applied to any kind of curricular content, because it can be used (with diagrams specifically) to disrupt ramified patterns of thought.