IEA International Computer and Information Literacy Study (ICILS) 2018 Assessment Framework


The report referred to “the extent to which ICT skills are included in the curriculum” and focused on both computer science and CT (OECD 2016a, p. 18), pointing to cases such as Sweden (where new curricula are being introduced in schools and teacher training) and Spain (where ICT was incorporated into school curricula as part of a wider national digital strategy). A 2012 review of existing definitions of ICT literacy argued that they related to the ability to access, evaluate, manage, and use information, as well as to the efficient application of technology (e.g., the effective use of applications and devices).

Computational thinking

PACT is based on design patterns for larger CT practices and involves judging the quality of the assembled instructions (or coding steps). Students' code was assessed by the research team using sets of rubrics that reflected elements of CT.

Recent education policy developments related to CIL and CT

Zhong et al. (2016) developed a three-dimensional evaluation framework based on the concepts of direction, openness, and process. The assessment included three pairs of tasks based on a three-dimensional programming language: (1) forward closed tasks and reverse closed tasks, (2) forward semi-open tasks and reverse semi-open tasks, and (3) open tasks with and without a creative design requirement.

Research on the use of digital technologies in learning

The authors of the report also indicated that students were more confident in their digital competences “when they had a high level of access to/use of ICT at home and at school” (European Commission 2013, p. 15). Greater interest in and enjoyment of ICT use was associated with higher CIL scores in nine of the 14 countries that met the ICILS sampling requirements.

Research questions

Computer and information literacy

Secondary analyses of ICILS 2013 show that teachers' attitudes are related to the extent and the ways in which they use ICT in teaching (Drossel et al. 2017a; Eickelmann and Vennemann 2017). Indeed, there is evidence that school factors, including the collaborative use of ICT among teachers, can shape the pedagogical use of ICT for instructional purposes (Drossel et al.).

Computational thinking

RQ CIL 4: What aspects of students' personal and social backgrounds (such as gender and socio-economic background) are related to students' CIL?

Participants and instruments

Participants and sampling

Instruments

A school principal questionnaire is administered to the principals of selected schools and is designed to capture the characteristics of the school, the use of ICT in teaching and learning, and aspects of the management of ICT in the school.

Computer and information literacy framework

  • Overview
  • Defining computer and information literacy
  • Revising the structure of the computer and information literacy construct
  • Structure of the ICILS 2018 computer and information literacy construct
  • Strands and aspects of computer and information literacy
    • Strand 1: Understanding computer use
      • Foundations of computer use
      • Computer use conventions
    • Strand 2: Gathering information
      • Accessing and evaluating information
      • Managing information
    • Strand 3: Producing information
      • Transforming information
      • Creating information
    • Strand 4: Digital communication
      • Sharing information
      • Using information responsibly and safely

The change in structure does not imply a change in ICILS assessment content, nor does it presuppose a change to the analytical structure of the CIL construct. The increased importance of accessing and evaluating information is also a direct result of the increasing amount and range of unfiltered computer-based (and computer-delivered) information available.

Figure 2.1: ICILS 2018 CIL framework

Computational thinking framework

  • Overview
  • Defining computational thinking
  • Structure of the ICILS 2018 computational thinking construct
  • Strands and aspects of computational thinking
    • Strand 1: Conceptualizing problems
      • Knowing about and understanding digital systems
      • Formulating and analyzing problems
      • Collecting and representing relevant data
    • Strand 2: Operationalizing solutions
      • Planning and evaluating solutions
      • Developing algorithms, programs and interfaces

For ICILS, the definition and explanation of CT, as for CIL, should be considered in the context of the ICILS assessment parameters. Knowing about and understanding digital systems refers to a person's ability to identify and describe the properties of a system by observing the interaction of the components within it. For example, a student could be expected to trace the actions of players, and the resulting outcomes, according to the specified rules and conditions of a game.

The strand includes an understanding of users' needs and their likely interaction with the system under development, for example, designing the components of a solution while taking into account system limitations and user needs.

Contextual framework

Overview

Classification of contextual factors

Antecedents are exogenous factors that condition the ways in which CIL/CT learning takes place. They are contextual factors that are not directly influenced by the variables or outcomes of the learning process. The single-headed arrow between antecedents and processes indicates the ICILS contextual framework's assumption of a unidirectional influence from antecedents to processes.

The student questionnaire will primarily collect data on contextual factors relating to the individual student and his or her home context. The teacher, principal, and ICT coordinator questionnaires are designed to capture contextual factors associated with the school/classroom level, while the national contexts survey and other available sources (e.g., published statistics) will provide contextual data at the level of the wider community.
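This mapping of instruments to contextual levels can be summarized in a minimal Python sketch; the instrument labels are paraphrased from the paragraph above and are not official ICILS identifiers:

```python
# Which contextual level each ICILS instrument mainly captures
# (labels paraphrased from the framework text; not official identifiers).
INSTRUMENT_LEVELS = {
    "student questionnaire": ["individual", "home"],
    "teacher questionnaire": ["school/classroom"],
    "principal questionnaire": ["school/classroom"],
    "ICT coordinator questionnaire": ["school/classroom"],
    "national contexts survey": ["wider community"],
    "published statistics and other sources": ["wider community"],
}
```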

Figure 4.1: Contexts for ICILS 2018 CIL/CT learning outcomes

Contextual levels and variables

  • The wider community context
  • School/classroom context
  • Home context
  • Individual context

Teachers' sense of self-efficacy in using basic ICT has been reported to be linked to greater use of ICT in the classroom (Hatlevik 2016; Law et al. 2008). Furthermore, ICILS 2013 reported that "older teachers typically had less positive views than younger teachers about the use of ICT and expressed lower confidence in their ability to use ICT in their teaching practice" (Fraillon et al. 2014, p. 257). The ICILS 2018 teacher questionnaire will therefore collect information about teachers' backgrounds (such as gender and years of teaching experience), their use of ICT for teaching purposes, their general use of computers in different places, their participation in ICT-related professional development activities, and their perceived confidence in using ICT for different tasks. Results from ICILS 2013 also indicated that teachers across participating countries tended to recognize positive benefits of using ICT in teaching (Fraillon et al. 2014).

Results from ICILS 2013 showed that, in participating countries, socio-economic background consistently explained significant variation in students' CIL, and that students who expected to complete a university degree also had higher levels of CIL (Fraillon et al. 2014).

ICILS instruments

Test instrument overview

  • Test interface

The test interface provides students with information about their progress during the test (such as the number of completed and remaining tasks and the time remaining). It includes navigation buttons that allow students to move between tasks, and an information button that gives access to general test administration information and task-specific information, such as scoring criteria or detailed task instructions. The stimulus area is a space that contains either non-interactive content, such as an image of a website login screen, or interactive content, such as electronic documents or live software applications.

The test interface and stimulus area were similar to those used in ICILS 2013. The location and functionality of the interface elements (such as the navigation buttons and the task progress indicator) were not changed, but their appearance was updated to conform to 2018 interface design conventions.

The ICILS test instrument design

  • CIL test modules
  • CT test modules
  • Test module rotation

Each module has a unifying theme and consists of a series of tasks related to that theme; unlike the CIL modules, however, the tasks within the CT modules are not directly related to the development of a large task. Some tasks relate to the use of simulations to collect data and draw conclusions about real-world situations that can inform the planning of a computer program. Task difficulty relates to the variety of code functions available and the complexity of the sequence of actions the drone must perform to complete the task.

The two CIL modules each student completes are drawn from the five available modules, giving 20 possible module permutations. There are two permutations of the two CT modules, and each student is randomly assigned one CIL permutation and one CT permutation.
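The rotation arithmetic can be illustrated with a short Python sketch; the module names and the assignment mechanics below are placeholder assumptions, not the actual ICILS implementation:

```python
import itertools
import random

# Placeholder module names (not the actual ICILS 2018 module titles).
CIL_MODULES = ["CIL-A", "CIL-B", "CIL-C", "CIL-D", "CIL-E"]
CT_MODULES = ["CT-1", "CT-2"]

# Ordered pairs of two distinct CIL modules: 5 * 4 = 20 permutations.
CIL_PERMUTATIONS = list(itertools.permutations(CIL_MODULES, 2))
assert len(CIL_PERMUTATIONS) == 20

# The two CT modules can be administered in either order: 2 permutations.
CT_PERMUTATIONS = list(itertools.permutations(CT_MODULES, 2))
assert len(CT_PERMUTATIONS) == 2

def assign_modules(rng):
    """Randomly assign one CIL permutation and one CT permutation to a student."""
    return rng.choice(CIL_PERMUTATIONS), rng.choice(CT_PERMUTATIONS)

rng = random.Random(2018)  # fixed seed so the example is reproducible
print(assign_modules(rng))  # prints one (CIL pair, CT pair) assignment
```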

Types of assessment task: CIL

  • Task type 1: Information-based response tasks
  • Task type 2: Skills tasks
  • Task type 3: Authoring tasks

Example task 2 (Figure 5.3) requires students to examine a non-interactive email (in this case, a suspected phishing email) and to respond using free text in a text input box in the lower portion of the test interface. The dynamic computer-based environment in example task 1 (Figure 5.2) allows students to view each of four website structures in turn. That task is therefore an example of a nonlinear skills task that requires information-management skills and relates to aspect 2.2 (managing information) of the CIL construct.

The authoring task is scored manually based on the accuracy with which the various specified elements of the garden design are shown in the diagram. Scoring criteria can be categorized as relating to the student's use of (1) software features and (2) the available information.
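A two-category rubric of this kind might be represented as in the following sketch, where the criterion wording and the one-point-per-criterion rule are illustrative assumptions, not the actual ICILS rubric:

```python
# Hypothetical criteria grouped into the two categories named above.
RUBRIC = {
    "software_features": [
        "uses the drawing tools to place elements on the diagram",
        "uses labelling features so that each element is identifiable",
    ],
    "available_information": [
        "includes all elements specified in the task information",
        "positions elements consistently with the given specifications",
    ],
}

def total_score(awarded):
    """Sum one point for each criterion a scorer marked as satisfied."""
    return sum(sum(flags) for flags in awarded.values())

# Example: a scorer awards three of the four criteria.
print(total_score({
    "software_features": [True, False],
    "available_information": [True, True],
}))  # -> 3
```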

Figure 5.2: Example task 1 (a typical multiple-choice task)

Types of assessment task: CT

  • Task type 4: Nonlinear systems transfer tasks
  • Task type 5: Simulation tasks
  • Task type 6: Visual coding tasks

Simulation tasks such as example task 8 typically relate to aspect 1.3 (collecting and representing relevant data) of the CT construct. Accordingly, in this section we only describe the features of the visual coding task interface and the types of visual coding tasks that students may complete in the ICILS CT test. The interface provides:

  • The facility for students to run the code at any time and to see the resulting behavior of the drone while the code is running.
  • The ability to reset the code in the workspace (to the default state for each task) and to reset the starting position of the drone before executing the code.

These tasks relate to aspect 2.2 (developing algorithms, programs, and interfaces) of the ICILS CT framework.

Figure 5.8: Example task 7 (nonlinear systems transfer task)

Mapping test items to the CIL and CT frameworks

Algorithm construction tasks require students to develop their own solution to a problem by iteratively adding blocks of code to the workspace and running the algorithm to see the results. Students' answers are judged on the accuracy with which the code achieves the specified objective, as well as on the efficiency of the code, taking into account the number of code blocks used. In algorithm debugging tasks, students are presented with an existing set of code blocks in the workspace, a description of the intended outcome of executing the code, and an indication that the code is not working and needs to be fixed.

Students are free to change the code and can also reset the code blocks in the workspace to the default state of the task (i.e., re-displaying the original incorrect code that requires debugging). Students are assessed on how closely their submitted solution matches the ideal solution, including how precisely the code meets the specified goal.
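The kind of comparison described here can be sketched as follows, assuming a response is captured as a sequence of code blocks; the block names and the accuracy/efficiency rules are hypothetical, not the actual ICILS scoring algorithm:

```python
# Hypothetical ideal solution for a drone task, as a sequence of code blocks.
IDEAL_SOLUTION = ["forward", "forward", "turn_left", "forward", "drop_supplies"]

def judge(student_blocks):
    """Compare a student's block sequence with the ideal solution."""
    # Accuracy: a simplistic stand-in that checks for an exact match with the
    # ideal sequence (a real scorer would evaluate the drone's behavior).
    accurate = student_blocks == IDEAL_SOLUTION
    # Efficiency: penalize solutions that use more blocks than the ideal one.
    efficiency = len(IDEAL_SOLUTION) / max(len(student_blocks), 1)
    return {"accurate": accurate, "efficiency": round(efficiency, 2)}

print(judge(["forward", "forward", "turn_left", "forward", "drop_supplies"]))
# {'accurate': True, 'efficiency': 1.0}
print(judge(["forward", "forward", "forward", "turn_left", "forward",
             "drop_supplies"]))
# {'accurate': False, 'efficiency': 0.83}
```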

Table 5.2: Mapping the CT test items to the CT framework

The ICILS student questionnaire and context instruments

  • Student questionnaire
  • Teacher questionnaire
  • School questionnaires
  • National contexts survey

Ekaterina Mikheeva, Deputy Head of International Data at ICILS; Sabine Meinck, Head of Research, Analysis and Sampling Unit; Sabine Tieck, Research Analyst (Sampling); Adeoye Oyekan, Research Analyst; Christine Busch, Research Analyst; Alena Becker, Research Analyst; Hannah Köhler, Research Analyst; Wolfram Jarchow, Research Analyst; Lorelia Nithianandan, Research Analyst; Rea Car, Research Analyst; Clara Beyer, Research Analyst; Dirk Oehler, Research Analyst; Tim Daniel, Research Analyst; Yasin Afana, Research Analyst; Guido Martin, Head of Coding Unit.

The national research coordinators provided policy- and content-oriented advice on the development of the instruments and were responsible for the implementation of ICILS in the participating countries. Italy: INVALSI, National Institute for Educational Evaluation of Instruction and Training, Gemma De Sanctis (through May 2018).

