Ascribing meaning to the standards and guidelines for quality assurance in the European higher education area in Portugal

Paper presented in track 1 ...
The data were collected through a census: the link to the online questionnaire was sent to all Portuguese HEIs, which were asked to distribute it among their academic staff. A total of 2,191 valid answers was obtained, from a total population of 34,488 according to the data provided by REBIDES 2010 (the survey of the higher education academic staff record). To make the sample representative of the population, it was decided to weight it on three variables: higher education subsystem, gender, and research area. After weighting, the sample consists of 2,099 academics from 15 public universities, 23 public polytechnics, 20 private institutions offering university degrees and 29 private polytechnics spread all over the country. Most of the sample consists of male academics (55.2%; 1,158), aged between 36 and 45 years (33.7%; 699) or between 46 and 55 years (36.8%; 762). Most academics are from the Social Sciences (38.0%; 797) and from the public polytechnic subsystem (37.7%; 791), and most hold a doctorate (52.8%; 1,087). The majority are assistant professors (47.4%, 500, in the university subsystem; 47.5%, 474, in the polytechnic subsystem) (see Table 1). Although these characterising variables are pertinent to explain academics' perceptions of quality assurance (see Cardoso, Rosa, and Santos, 2013; Manatos, Rosa, and Sarrico, 2015), they were not considered in the present analysis.
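The weighting procedure described above can be sketched as a simple post-stratification: each respondent in a stratification cell is weighted by the ratio of that cell's population share to its sample share. The sketch below is illustrative only, not the study's actual code; the subsystem cell counts are invented for the example (only the overall totals of 34,488 and 2,191 come from the text).

```python
# Illustrative post-stratification weighting sketch (hypothetical cell counts).
# Each cell's weight = population share / sample share, so the weighted sample
# reproduces the population distribution on the stratification variable.

def poststrat_weights(pop_counts, sample_counts):
    """pop_counts, sample_counts: dicts mapping cell -> count.
    Returns a dict mapping cell -> weight for respondents in that cell."""
    pop_total = sum(pop_counts.values())
    samp_total = sum(sample_counts.values())
    return {
        cell: (pop_counts[cell] / pop_total) / (sample_counts[cell] / samp_total)
        for cell in pop_counts
    }

# Hypothetical split of the 34,488 academics and 2,191 respondents by subsystem
pop = {"university": 20000, "polytechnic": 14488}
samp = {"university": 1000, "polytechnic": 1191}
weights = poststrat_weights(pop, samp)
print({cell: round(w, 3) for cell, w in weights.items()})
```

Note that the weighted respondent counts still sum to the original sample size; in the study the weighting was done jointly on subsystem, gender and research area rather than on a single variable as in this sketch.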
For the purpose of this paper, the answers obtained on the degree of implementation of the seven ESG Part 1 standards in Portuguese HEIs were used (see Table A1, in the annex, for the questions designed to operationalise each of the standards). Firstly, a descriptive analysis was undertaken.
Secondly, a factor analysis using an oblique rotation (Promax) was conducted to investigate whether the variables intended to operationalise the seven ESG Part 1 standards in the questionnaire could be grouped. Thirdly, an analysis of the factor solution was developed, and some relevant descriptive statistics were computed for each of the implementation subscales identified.
Main results

Overall, the implementation of the ESG Part 1 standards and guidelines in Portuguese HEIs is quite significant (all mean scores are equal to or higher than 4.0 and almost all medians are around 6).
Nevertheless, the results also indicate some deficit (medians of 4 and 5) concerning mechanisms capable of assuring teaching staff quality, the consultation of external stakeholders, and an information system sufficiently broad in scope and capable of effectively promoting institutional self-assessment.
Trying to uncover links between the different practices included under each standard (the guidelines), we used exploratory factor analysis to identify the set of factors that could explain most of the variance observed in academics' answers, and to see how far these factors represent the seven ESG Part 1 standards. The factor analysis performed on the data revealed six factors with eigenvalues higher than 1 (corresponding to 69% of total variance explained), resulting in an ESG implementation scale consisting of 63 items (the sentences designed to operationalise the ESG standards) and six subscales, each corresponding to one of the six extracted factors (see Table 2 and Table A2 in the annex).
The items for each subscale were selected based on their loadings on each of the extracted factors, all loadings being higher than 0.3 (according to Hair et al. (1998), this is the threshold value when sample sizes are higher than 350). Whenever an item presented loadings higher than 0.3 on more than one factor, the decision was to allocate it to the factor where its loading was highest.
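The allocation rule just described can be expressed compactly; the sketch below is illustrative only (hypothetical item names and loadings, not the study's data), showing how items are dropped below the 0.3 threshold and how a cross-loading item goes to its highest-loading factor.

```python
# Illustrative sketch: allocating questionnaire items to subscales by their
# factor loadings, following the rule described in the text. Loadings below
# 0.3 are ignored; items loading above 0.3 on several factors are allocated
# to the factor with the highest loading.

def allocate_items(loadings, threshold=0.3):
    """loadings: dict mapping item name -> list of loadings (one per factor).
    Returns a dict mapping factor index -> list of allocated item names."""
    subscales = {}
    for item, row in loadings.items():
        # keep only loadings above the threshold (absolute value)
        candidates = [(abs(l), f) for f, l in enumerate(row) if abs(l) > threshold]
        if not candidates:
            continue  # item does not load clearly on any factor
        _, factor = max(candidates)  # highest loading wins
        subscales.setdefault(factor, []).append(item)
    return subscales

# Hypothetical loadings of three items on two factors
example = {
    "item_01": [0.72, 0.10],   # loads only on factor 0
    "item_02": [0.35, 0.61],   # cross-loading: factor 1 wins
    "item_03": [0.12, 0.08],   # no clear loading on either factor
}
print(allocate_items(example))  # → {0: ['item_01'], 1: ['item_02']}
```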
Table 2 presents the statistics computed for each of the subscales. These subscales are formed by combining all items loading highly on a factor and using their mean as a replacement variable – the subscale (Hair et al., 1998). Internal consistency for the six subscales ranged from 0.93 to 0.96.
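The two computations behind this paragraph – the subscale as the mean of its items, and an internal-consistency check – can be sketched as follows. This is an illustrative sketch with invented answers on a 1–7 scale, not the study's code or data, and it assumes the internal-consistency measure is Cronbach's alpha (the usual choice, though the text does not name it).

```python
# Illustrative sketch (hypothetical data): building a subscale as the mean of
# its items, and computing Cronbach's alpha for internal consistency.
from statistics import pvariance

def cronbach_alpha(items):
    """items: list of per-item score lists, all of the same length (one score
    per respondent). Returns Cronbach's alpha using population variances."""
    k = len(items)
    item_vars = sum(pvariance(col) for col in items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    return (k / (k - 1)) * (1 - item_vars / pvariance(totals))

def subscale_scores(items):
    """Subscale score per respondent = mean of that respondent's item scores."""
    return [sum(scores) / len(scores) for scores in zip(*items)]

# Hypothetical answers of five respondents to three items (1-7 scale)
items = [
    [6, 5, 7, 4, 6],
    [6, 4, 7, 5, 6],
    [5, 5, 6, 4, 7],
]
print(round(cronbach_alpha(items), 2))  # → 0.87
print(subscale_scores(items))           # one subscale score per respondent
```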
Table 2. Academics' perceptions on the degree of implementation of the ESG practices (guidelines).
Correlations between the subscales varied between 0.65 and 0.74 (Pearson's r), and all were significant at the 0.01 level (2-tailed), supporting the conclusion that the six subscales are highly correlated (see Table A3 in the annex). The t-tests for paired samples, considering all possible combinations, revealed a mix of statistically significant and non-significant differences (for a 2-tailed significance level of 0.05) among the mean scores of the subscales. This means that academics' perceptions of the degree of implementation tend to be the same for at least some of the subscales.
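The paired-samples comparison used here treats the two subscale scores of each respondent as a pair and tests whether the mean difference is zero. The sketch below is illustrative only (invented scores for six respondents, not the study's data); the p-value would then be read from a t distribution with n − 1 degrees of freedom (e.g. via scipy.stats).

```python
# Illustrative sketch (hypothetical data): paired-samples t statistic for the
# difference between two subscale scores of the same respondents.
from math import sqrt
from statistics import mean, stdev

def paired_t(x, y):
    """x, y: paired subscale scores for the same respondents.
    Returns the t statistic for the mean of the pairwise differences."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    # t = mean(d) / (sd(d) / sqrt(n)), with sample standard deviation
    return mean(diffs) / (stdev(diffs) / sqrt(n))

# Hypothetical subscale scores of six respondents on two subscales
ss1 = [5.8, 6.0, 4.9, 6.3, 5.5, 6.1]
ss2 = [5.5, 5.9, 4.6, 6.0, 5.6, 5.8]
print(round(paired_t(ss1, ss2), 2))
```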
Ascribing meaning to ESG Part 1 implementation

One of the interesting things about exploratory factor analysis is that we start from the data to uncover links among a set of variables, instead of pre-determining subsets of variables. In this case we started from the Portuguese academics' perceptions of the degree of implementation of the seven ESG Part 1 standards in their own institutions, and ended up with six subscales grouping quality assurance practices derived from these standards' guidelines.
Each of the subscales was then tentatively named based on the largest number of related practices included in it, taking into consideration the variables with the highest loadings (Table 2).
Nevertheless, this was not an easy task, because each subscale presents a mix of different practices and it is difficult to find a common denominator. Subscale SS1 contains mainly variables reflecting students' assessment practices and as such was named Students Assessment. Subscale SS2 brings together a set of practices related to the quality assurance of teaching staff, practices related to the institution's self-knowledge with the goal of continuous improvement, and practices related to the participation of external stakeholders in quality assurance – it was therefore named Teaching Staff, External Stakeholders and Self-Knowledge. Subscale SS3 groups a set of variables translating quality assurance practices at the level of the resources and support services available for teaching and learning, as well as a set of variables related to the information provided by the institution both internally and externally – we therefore opted for the designation Teaching and Learning Resources (assuming the broad perspective that resources also include information resources). As for subscale SS4, all the variables included in it translate the issues the policy statement for quality assurance should include – we therefore named it Policy Statement for Quality Assurance. Subscale SS5 includes only variables reflecting quality practices related to information collection and analysis – we therefore opted to name it Information Systems. Finally, subscale SS6 groups a set of different practices ranging from the type and quality of the public information published by the institution to the existence and continuous improvement of physical resources and support services for students' learning – in this case we were unable to find an adequate name and left it as Undefined (Hair et al., 1998).
When opting for exploratory factor analysis to treat the collected data, our expectation was, at least to a certain extent, that seven factors would emerge from our analysis, each of them resembling the ESG Part 1 standard from which the variables used in the analysis had been derived.
But the fact is that only six subscales emerged, and these do not exactly match the standards proposed by ENQA. In fact, only subscales SS1 – Students Assessment, SS4 – Policy Statement for Quality Assurance, and SS5 – Information Systems contain mainly variables reflecting the guidelines under ESG 1.3, ESG 1.1 and ESG 1.6, respectively. The other three subscales tend to be a mixture of variables designed to operationalise different ESG. Furthermore, 18 out of the 63 variables have significant loadings on more than one factor.
These results may reflect the fact that the guidelines proposed under each standard cover quality assurance practices that are closely related to a variety of different standards, despite being assigned to just one in the current formulation of the ESG. There are obviously other reasons we can think of to tentatively explain the results obtained. One is the fact that we are exploring the links between the variables based on the perceptions Portuguese academics have of their implementation, and it may be the case that Portuguese institutions are indeed implementing individual practices from the seven standards differently, without paying much attention to each standard as a whole. Another explanation may be that we created too many variables when designing our questionnaire: we looked carefully into the text of the guidelines for each standard, and the questions we drafted tried to cover those guidelines as much as possible, which may be too fine-grained a basis on which to judge the implementation of each standard as a whole.
Conclusion

As far as we know, this is the first attempt to look at the degree of implementation of the ESG Part 1 in HEIs using a quantitative methodological approach. Moreover, the goal was not just to look at the degree of implementation per se, but also to try to provide some understanding of the underlying structure of the ESG Part 1, and eventually of how it can be used effectively as a framework for the implementation of internal quality assurance systems (IQAS) within HEIs. More work on the interpretation of the subscales now has to be done to better understand the underlying structure of the ESG and how it can be appropriated by HEIs.
One line of research regarding this interpretation might explore how evolving institutional governance mechanisms are changing the way quality is managed in higher education institutions (Sarrico et al., 2013), and eventually how the ESG might accommodate those changes.
Our analysis is restricted to the Portuguese context and should be widened. Nevertheless, we expect our work to be a point of departure for further understanding and implementation of the ESG Part 1.
References

Berlin Communiqué. (2003). Realising the European Higher Education Area. Communiqué of the Conference of Ministers responsible for Higher Education. Retrieved from www.ehea.info

Cardoso, S., Rosa, M.J., Santos, C.S. (2013). Different academics' characteristics, different perceptions on quality assessment? Quality Assurance in Education, 21(1), 96–117.
Dill, D., Teixeira, P., Jongbloed, B., Amaral, A. (2004). Conclusion. In P. Teixeira, B. Jongbloed, D. Dill, A. Amaral (Eds.), Markets in Higher Education: Rhetoric or Reality? (pp. 327–351). Dordrecht:
DGES (Direção Geral do Ensino Superior). (2015). Rede de Ensino Superior: Estabelecimentos [Higher Education Network: Institutions].
ENQA. (2009). Standards and Guidelines for Quality Assurance in the European Higher Education Area (3rd ed.). Helsinki: European Association for Quality Assurance in Higher Education.
Fonseca, M., Encarnação, S., Justino, E. (2014). Shrinking higher education systems. In G. Goastellec and F. Picard (Eds.), Higher Education in Societies (pp. 127–147). Rotterdam: Sense Publishers.
Hair, J.F. Jr., Anderson, R.E., Tatham, R.L., Black, W.C. (1998). Multivariate Data Analysis (5th ed.). New Jersey: Prentice-Hall.
Harvey, L., Newton, J. (2007). Transforming quality evaluation: Moving on. In D.F. Westerheijden, B. Stensaker and M.J. Rosa (Eds.), Quality Assurance in Higher Education: Trends in Regulation, Translation and Transformation (pp. 225–246). Dordrecht: Springer.
Filippakou, O., Tapper, T. (2008). Quality assurance and quality enhancement in higher education: Contested territories? Higher Education Quarterly, 62, 84–100.
Kohoutek, J., Westerheijden, D. (2014). Opening up the black box. In H. Eggins (Ed.), Drivers and barriers to achieving quality in higher education (pp. 167–175). Rotterdam: Sense.
Manatos, M.J., Rosa, M.J., Sarrico, C.S. (2015). The importance and degree of implementation of the European standards and guidelines for internal quality assurance in universities: the views of Portuguese academics. Tertiary Education and Management, 21(3), 245–261.
Prikulis, A., Rusakova, A., Rauhvargers, A. (2013). Internal quality assurance policies and systems in European higher education institutions. Journal of the European Higher Education Area, 4, 1–16.
Rosa, M.J., Amaral, A. (2012). Is there a bridge between quality and quality assurance? In B. Stensaker, J. Välimaa and C.S. Sarrico (Eds.), Managing Reform in Universities: The Dynamics of Culture, Identity and Organisational Change (pp. 114–134). Basingstoke: Palgrave.
Rosa, M.J., Sarrico, C.S. (2012). Quality, evaluation and accreditation: From steering, through compliance, on to enhancement and innovation? In A. Amaral and G. Neave (Eds.), Higher Education in Portugal 1974–2009: A Nation, a Generation (pp. 249–264). Dordrecht: Springer.
Santos, S.M.d. (2011). Análise comparativa dos processos para a avaliação e certificação de sistemas internos de garantia da qualidade [Comparative analysis of the processes for the evaluation and certification of internal quality assurance systems]. Lisboa: A3ES – Agência de Avaliação e Acreditação do Ensino Superior.
Sarrico, C. S. (2010). On performance in higher education: towards performance governance? Tertiary Education and Management, 16(2), 145-158.