This document presents the findings of the analysis of the features recommended for an ideal, feasible system for the certification of digital skills in Europe, as defined by the CEPIS Digital Skills Policy (CEPIS DSP) expert group.

These recommendations have been developed within the context of the European Digital Skills Certificate (EDSC) feasibility study, in which CEPIS takes an active part. Digital skills and their certification have been a priority area for CEPIS since its inception, and the topic is very important to the CEPIS member societies. Thus, the recommendations present the considered view of the CEPIS community.


The CEPIS DSP expert group considers the DigComp framework for the digital competence of citizens a good reference for aligning the efforts of the European Union regarding digital skills and competences.

  • Therefore, the certification system should be aligned with the DigComp description of digital skills and competences, covering the different areas and competences as fully as possible.

According to the description of the DigComp framework, version 2.2, ‘digital competence’ is a complex concept that includes many different aspects. Some aspects may be more important than others for particular users, depending on the context in which they need to use digital resources.

  • Therefore, an ideal, feasible system should offer a modular structure of tests that enables each user to decide which aspects of digital competence they want to certify. The most flexible option would be individual certification of each of the 21 competences described in DigComp version 2.2, or at least blocks/modules of a few competences grouped logically where possible.
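For illustration, the modular structure recommended above could be sketched as a simple data model. The area names and competence counts below follow DigComp 2.2; the idea of one module per area is an illustrative assumption, not a proposal of this document:

```python
# Hypothetical sketch of a modular test structure based on DigComp 2.2,
# which defines 5 competence areas covering 21 competences in total.
# Grouping one module per area is an illustrative assumption only.
DIGCOMP_AREAS = {
    "Information and data literacy": 3,
    "Communication and collaboration": 6,
    "Digital content creation": 4,
    "Safety": 4,
    "Problem solving": 4,
}

def total_competences(areas: dict = DIGCOMP_AREAS) -> int:
    """Number of competences that could be individually certifiable."""
    return sum(areas.values())

print(total_competences())  # 21
```

Under this kind of model, a candidate could register for any subset of the 21 competences, or for a whole area as one module.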

Although the description of the DigComp framework is clear and complete, especially with the help of the 250 examples of knowledge, skills and attitudes in DigComp version 2.2, it is far from detailed enough to directly describe the specific points to be checked in a possible test of competence.

  • Therefore, if tests (e.g., from an existing certification provider) need to show how they map against DigComp, they should express the points of their programmes in terms compatible with the descriptions of items in DigComp.
  • As the DigComp version 2.2 examples and descriptions follow a style similar to learning outcomes (e.g., describing examples of what the candidate should be capable of achieving in each competence), an ideal digital skills certification system should express, for each competence, a detailed set of capacities to be checked in terms of knowledge, skills and attitudes.
  • As noted in the CEDEFOP handbook on Defining, writing and applying learning outcomes, the style of learning outcomes can be customised for assessment specifications and qualifications by expressing “criteria, using learning outcomes statements, often formulated as threshold levels which must be met by the candidate”, in the style of a summative assessment. The use of appropriate verbs and expressions to describe clearly what the candidate needs to show or achieve during tests is essential. Describing clear, measurable criteria, levels of achievement and thresholds for qualification in a transparent and open way is a must. Furthermore, these should always be linked to the corresponding items stated in or derived from the DigComp version 2.2 examples and descriptions.
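The idea of measurable criteria with explicit thresholds can be sketched as follows. The criterion identifiers, statements and threshold values are hypothetical assumptions for illustration, not drawn from DigComp or from any assessment specification:

```python
# Hypothetical sketch: summative assessment against measurable criteria,
# each stating a threshold the candidate must meet. All identifiers,
# statements and thresholds are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Criterion:
    statement: str    # learning-outcome-style statement of the capacity
    threshold: float  # minimum score (0.0 to 1.0) to meet the criterion

def meets_all(scores: dict, criteria: dict) -> bool:
    """A candidate qualifies only if every criterion's threshold is met."""
    return all(scores.get(cid, 0.0) >= c.threshold for cid, c in criteria.items())

criteria = {
    "1.1-K1": Criterion("Identifies reliable sources of information", 0.75),
    "1.1-S1": Criterion("Performs an effective web search for a stated need", 0.80),
}
print(meets_all({"1.1-K1": 0.9, "1.1-S1": 0.85}, criteria))  # True
```

Making the criteria and thresholds explicit in this way is what enables the transparency and openness called for above.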

As DigComp is centred on digital competence, the existing definitions of competence should guide the way the EDSC checks the digital competences of candidates. The CEDEFOP Glossary of European education and training policy defines ‘competence’ as follows: “[a] demonstrated ability to use knowledge, know-how, experience, and – job-related, personal, social or methodological – skills, in work or study situations and in professional and personal development.” In addition, as noted in the glossary, “competence is not limited to cognitive elements (involving the use of theory, concepts or tacit knowledge); it also encompasses functional aspects – including technical skills – as well as interpersonal attributes (e.g. social or organisational skills) and ethical values”.

The standard EN 16234:2019 defines competence as “a demonstrated ability to apply knowledge, skills and attitudes for achieving observable results”.

Finally, DigComp version 2.2 defines competences as “a combination of knowledge, skills and attitudes, in other words, they are composed of concepts and facts (i.e. knowledge), descriptions of skills (e.g. the ability to carry out processes) and attitudes (e.g. a disposition, a mindset to act)”.

  • Thus, the ideal EDSC should verify knowledge, skills and attitudes (also considering interpersonal attributes) in real situations and contexts. Following the general mechanisms for evaluating these types of items, it is acceptable to use questions (e.g., multiple-choice questions) to check knowledge. Testing skills, however, requires performing practical tasks and, in the case of competence, performing those tasks in contexts as realistic as possible. For example, the tests should avoid simulations and instead require the candidate to use real tools in a realistic context, checking whether the achieved results are correct and, where relevant, whether specific recommended processes were followed.

Regarding attitudes or personal skills, it is difficult to check the candidate’s behaviour in real situations unless they can be observed in such situations over time. As techniques like observation would be difficult to implement in practice, the system should at least query candidates on their reaction to specific described situations (e.g., give them possible choices of actions or reactions after presenting a scenario). This is a limited approach, but it is better accepted than others that might be considered more intrusive, and it is less controversial regarding ethical boundaries, such as whether to address personality traits that can be considered ‘private’. Feasibility is a must for the EDSC, therefore:

  • The system should rely on automatic tests that do not require human intervention for the evaluation of results, making them less expensive and avoiding any potential unconscious bias on the part of the evaluator. More sophisticated options are therefore discarded. Human presence would of course be desirable to invigilate and manage processes, but not to perform or score the assessment.
  • The system should be as conceptually simple as possible in terms of the conditions for candidates who want to pursue a certificate of their digital competence. This is why it is recommended to offer candidates the possibility of remote testing with their own technical equipment (subject to a minimal, very affordable set of conditions), to overcome possible geographical barriers. Candidates who cannot use the remote option, e.g., because they cannot fulfil the minimum hardware and/or software conditions, should still have the option of taking the tests at a suitably accredited centre with proper equipment and a test proctor present. Conditions for remote assessment must strictly guarantee verification of the identity of the individual taking the tests as well as the fairness of the process, preventing unauthorised methods, use of information or assistance, under well-defined quality assurance procedures and mechanisms.

Evidently, the EDSC aims to be a universal pan-European proposal for the certification of digital skills. Therefore:

  • It should be fully compliant with the EU’s own regulations for a more inclusive society, especially to avoid any type of digital divide. Apart from the previous call for simplicity of conditions, digital accessibility is key to “the right of persons with disabilities and of the elderly to participate and be integrated” in the digital world (EU Directive 2016/2102).
  • It should be compliant with the European Norm EN 301 549 v3.2.1 (2021-03) and aligned with the following two directives: EU Directive 2016/2102, which has already entered into force and has been transposed into national legislation in all Member States, and EU Directive 2019/882, to be applied to many products from 28 June 2025 onwards. As a consequence, the EDSC should promote test systems that comply with EN 301 549 v3.2.1 (2021-03).
  • The EDSC should be independent from training on digital competence in several ways:
    • The possibility of taking the tests should not be contingent upon prior training: digital competence can be acquired in many different ways, and the certificate should focus only on demonstrating digital competence, not on the way it was acquired or developed. The aim is similar to that of a summative assessment.
    • To avoid potential bias, the persons in charge of managing and invigilating the tests should be independent from those in charge of training. This does not mean that a test provider cannot offer training to acquire digital competence and pass the test (as happens with many other certifications, from driving licences to certification of competence in foreign languages). However, such a provider must not disclose confidential information about actual tests.

A system for the certification of digital skills should not only be a quality IT-based solution or platform for testing the digital competence or capacity of a candidate. The quality of the process should also be assured, to enable a transparent, consistent and quality-focused method that helps ensure the validity and equality of certification test results. Independent certification of QA systems, such as ISO 9001, for the test systems and providers could play this role of process assurance. Other relevant existing standards, such as ISO/IEC 17024, could be recommendable, at least as a source of guidelines for a proper process of certifying the capacities of individuals.

  • It is important that the issued digital evidence or document of certification provides all the necessary details, for example:
    • Full information on the item that is being certified (competences, areas of competences, modules, etc.)
    • Version of DigComp to which the test is mapped
    • Version of the programme and test system of the certification provider
    • Date of the test(s) or of issuing of the certificate
    • Technical context used during the tests (e.g., name and version of the applications, browsers, etc. used as reference for the tests)
    • Having detailed information on the conditions and context of the test enables interested third parties (e.g., employers) to determine the validity and timeliness of the certification without needing an official certification expiration date. Managing official expiration dates could represent a significant effort both for certification providers and for certified individuals, without adding any relevant value over this implicit assessment of a certification’s time validity when detailed information on the conditions is available.
  • As part of quality, the trustworthiness of certifications issued under the scheme of the EDSC proposal should rely on modern systems such as digital badges with verification mechanisms, possibly linked to blockchain in the future. These mechanisms must allow easy confirmation of the validity and details of a certification when the holder shows it or allows access to such information. The possible connection to the EU micro-credentials system is perceived, at the time of writing, as being hindered by the bureaucracy associated with the process, but it is an option that should be explored.
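The detailed certificate metadata and the verification mechanism discussed above could be combined, purely as an illustration, in a record whose integrity can be checked. The field names, example values and the simple hash-based scheme below are assumptions for the sketch; a real badge system (e.g., one following the Open Badges specification) would define its own format and cryptography:

```python
# Hypothetical sketch: an issued certificate record carrying the detailed
# metadata recommended above, with a simple integrity check in the style
# of a verifiable digital badge. Field names, values and the hashing
# scheme are illustrative assumptions only.
import hashlib
import json

def issue(record: dict, secret: str) -> dict:
    """Attach a verification code derived from the record's content."""
    payload = json.dumps(record, sort_keys=True)
    code = hashlib.sha256((payload + secret).encode()).hexdigest()
    return {**record, "verification_code": code}

def verify(issued: dict, secret: str) -> bool:
    """Recompute the code; any tampering with the details invalidates it."""
    record = {k: v for k, v in issued.items() if k != "verification_code"}
    payload = json.dumps(record, sort_keys=True)
    expected = hashlib.sha256((payload + secret).encode()).hexdigest()
    return issued["verification_code"] == expected

cert = issue(
    {
        "certified_item": "Area 1: Information and data literacy",
        "digcomp_version": "2.2",
        "provider_program_version": "1.0",          # hypothetical value
        "test_date": "2024-05-01",                  # hypothetical value
        "technical_context": "Browser X 120",       # hypothetical tools
    },
    secret="issuer-private-secret",
)
print(verify(cert, "issuer-private-secret"))  # True
```

The point of the sketch is that the verification code covers all the recommended detail fields, so a third party checking the certificate also implicitly confirms the test conditions and context.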

Finally, it is relevant to note that the very advanced activities related to proficiency levels 7 and 8 in some of the DigComp competences closely resemble some of the lowest-level activities that some ICT professionals carry out in their daily work. It is therefore important to highlight that the fact that a user can perform, more or less appropriately, some of these low-level activities also carried out by professionals is not evidence that they are qualified as an ICT professional. The EDSC proposal should consider this possibly blurred border between the advanced digital competence of users represented in levels 7 and 8 of some DigComp competences and some of the lowest professional competences described in the standard EN 16234:2019 (which, in fact, has an annex on its relation with DigComp). The links between DigComp and EN 16234 were previously explored within the frame of the European project EU4D, in which staff and managers from CEPIS, as well as experts selected by CEPIS, participated as advisors.

