Interview with Cees van der Vleuten by Erin S. Barry

As part of the master's program in health professions education at the Uniformed Services University of the Health Sciences (US), interviews were held with established scholars. This interview was conducted by Erin Barry with Cees van der Vleuten. Cees talks about his career and about assessment, and offers some personal reflections on the field of health professions education. Thank you, Erin.

Dr. Cees van der Vleuten received his MA in Psychology from the University of Tilburg (Tilburg, The Netherlands) and his PhD in Education from the University of Maastricht (Maastricht, The Netherlands), where he has worked since 1982. He was appointed Professor of Education in the Department of Educational Development and Research in 1996. He holds appointments at Flinders University (Adelaide, Australia), Western Sydney University (Sydney, Australia), the University of the Witwatersrand (Johannesburg, South Africa), and the Uniformed Services University of the Health Sciences (Bethesda, USA). He has received multiple academic and career awards, including the John P. Hubbard Award for significant contributions to the research and development of assessment of medical competence from the National Board of Medical Examiners in 2005 and the Karolinska Institute Prize for Research in Medical Education for outstanding scholarship in 2012. Dr. van der Vleuten has supervised over 80 PhD students, with many more currently under supervision. In addition to reviewing for multiple journals, he is highly cited within the medical education field, having authored over 900 publications (including peer-reviewed articles, books, and chapters).

It is interesting that you started with a background in psychology and then moved into education. How did you become interested in the assessment field?

It happened during my psychology training. I studied psychology with the usual interest of helping people and wanting to become a clinical psychologist, which I actually did. But while my training was ongoing, I got caught up in the science of things. I liked the scientific orientation of the training program. When I graduated as a clinical psychologist, I worked for a year in a hospital and I didn't like it. I went back to university and retrained myself in a theoretical orientation in psychology, which was personality and psychometrics. I learned a lot about psychometrics. I was fascinated by the judgment literature and how people make judgments, and there was a lot of ongoing research at that moment in time - research that is still puzzling - in which clinical judgment was compared to computer judgments and statistical judgments. The statistical judgments always outperformed the clinical judgments, which is quite interesting. I became interested in judgment and the combination of judgment and psychometrics. How I came to medical education was pure coincidence, because back in the early 1980s, when I entered the labor market, there was a lot of academic unemployment. Maastricht University was new and growing. I went into medical education simply by coincidence. They wanted a psychometrician, I got recruited, and ever since, my interest has been in assessment. I had fun with that because I became responsible for the assessment program within the medical school. We did a couple of interesting things, like progress testing, which nobody in the rest of the world was doing. We were also among the first to really study the objective structured clinical examination (OSCE) during a time of OSCE development. So that's how I became interested in the field of assessment.

Based on your publications, you have worked in multiple different areas of assessment. How did you move between all of those areas?

I think the story is that, historically, we've been climbing Miller's pyramid with different methods of assessment. This was also part of my personal development. I did my PhD on the OSCE, which was new at that time. I wrote my first literature review together with David Swanson from the National Board of Medical Examiners, and that became a golden oldie. It was one of the first reviews on the OSCE and it received a lot of citations. My work followed the development of every new layer in Miller's pyramid and all aspects of that, including standard setting, training, and feedback.

I think what is important, and was important for my development, is that I was based in an educational environment, and a particular educational environment at that - the environment at Maastricht University. Maastricht was the second university in the world to adopt problem-based learning. Problem-based learning means that you work on authentic problems and look at problem solving, clinical reasoning, and self-directed learning, which are not easy to assess. The fact that I was in an education context has molded me very differently. I wonder what would have happened if I had been recruited by the National Board of Medical Examiners? I would have become a totally different person with a totally different perspective. My psychometric perspective started to change and became much more an educational perspective on assessment. I started to wonder: why do we focus only on psychometric aspects? If we try to validate our assessments (in terms of reliability and validity), why don't we also optimize them for learning? I was in an educational setting, trying to create constructive alignment between what we wanted to do with the educational program and what we could do with the assessment program. This constructive alignment has challenged me my whole life. I also wanted to optimize learning, and I think one of my papers that has been cited frequently was about using a utility formula. The utility of a particular assessment approach depends on a number of quality characteristics - reliability, validity, educational impact, acceptability, cost, resources - and you cannot have it all! You can't have a perfectly reliable and a perfectly valid instrument, and you don't have unlimited resources. Every assessment selection is a compromise, and depending on the situation and the context, your choices will be made differently. If I am a psychometrician working with the National Board of Medical Examiners and I need to make decisions about certification, I will not compromise on reliability. But if I'm in an educational context, I'm willing to compromise on reliability in favor of the educational value of the assessment. This has been the basis of my later thinking, because I wanted to optimize assessment for learning. Over time, I wouldn't call myself a psychometrician anymore; I've become much more an education specialist. In assessment, I strongly promote the value of education, and my later approach, programmatic assessment, is a complete expression of that - it's a very education-minded view of assessment.
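
[Editor's note: the utility formula Cees refers to here is usually presented as a conceptual, weighted multiplicative model rather than a measured equation. A rough sketch, using the characteristics he names above (the symbols and weights are this note's paraphrase for illustration, not a quotation from the interview), is:

Utility ≈ (w_R × Reliability) × (w_V × Validity) × (w_E × Educational impact) × (w_A × Acceptability) × (w_C × Cost-efficiency)

where each weight w expresses how much a given context values that characteristic. Because the model is multiplicative, letting any single characteristic drop toward zero pulls the overall utility toward zero, which is the compromise Cees describes: a certification body may weight reliability very heavily, while an educational program may accept lower reliability in exchange for greater educational impact.]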

Another thing: having been trained in psychology, with its strong experimental nature, and as a psychometrician, I was completely schooled in the positivist tradition. The positivist view works with standardized assessment (like the OSCE), but it fails you when you talk about the educational matters around assessment. If you want to assess complex things like professionalism, communication, or team learning, the pure psychometric approach will fail you. In my work as an academic and scholar, I became acquainted with qualitative research, a different paradigm. I've used elements from that paradigm to widen my thinking about assessment. I think the qualitative paradigm is just as important as the quantitative one. In assessment, I think we should use both. If you look at my publications, you will find some of those elements. I think that when assessing more complex skills, words are much more important than scores. In assessment it's all about scores and grades, but those are pretty poor information carriers. Narrative information is just as important as quantitative information, which all relates back to the question of objectivity.

I made a historical journey following Miller's pyramid, adjusting from a positivist, psychometric perspective towards a more qualitative, learning-oriented perspective on assessment. My constant drive was this constructive alignment - what are we doing in education; what do we want to do in education; what are we doing in assessment? Assessment drives learning, and now I want learning to drive assessment.

Which topic area of assessment have you enjoyed the most?

I think that would be my discovery of programmatic assessment, because all of a sudden it hit me. It was a sort of eureka moment. Putting that together, defending it, and then seeing other people get enthusiastic about it and seeing that other people have implemented it - that's when I really get interested and excited. But I've had excitement throughout my career. I could be very excited about a psychometric analysis that we did in the very beginning with the OSCE and how we should develop that. What has truly excited me are the consistencies we've found in our research. Every couple of years I try to write an overview of where we are. I get excited all the time. I've been driven all the time.

My choice of career direction was right! When I got hit by the desire to develop as a scientist, as a scholar, that really gave me the satisfaction of trying to make the next steps and push boundaries all the time. That has been quite exciting all along the way, and there are a number of developments ahead of us that I can be very excited about. To give an example of what needs to be done in the near future: to see whether we can use patient outcome data to inform the performance of an individual learner, such as a postgraduate resident or clinician. I think that's a new, exciting area that we need to discover. I'm constantly excited. Over the years, I've had the privilege of working with a lot of talented people who wanted to do a doctoral dissertation with me. I can tell you lots of exciting stories about lots of nice individual journeys. I can't say there's a single piece of excitement. It's been a long journey working with a lot of very bright people doing research.

I've had many offers to work in other places internationally, but the nice thing about Maastricht University is that it's quite conducive to doing scholarly work in education. That also has been quite an exciting thing. We started problem-based learning in 1982. It was not based on scientific evidence, and over the years we've been doing a lot of research, which has been a lot of fun too. We now have a much better understanding of what learning is and how best to teach.

Where is the field of assessment going?

The future of the field lies in programmatic assessment and also in looking at feedback from patient outcomes. Naturally there are lots of challenges, because it's not easy to attribute patient outcomes to individuals, and there is also a lot of team performance involved. It's complex.

Communication seems to be a big area to go into next. Students don't always understand the importance of effective communication. An example is the PhD dissertation of one of my students on communication, in which she compared two learning settings in postgraduate training - one in general practice and one in surgery. General practice spent a lot of time on communication training, particularly on video assessment for feedback. She demonstrated that communication develops as a personal attribute, which can readily be applied for clinical purposes: you can switch communication strategies to achieve certain clinical outcomes. In surgery, there was no communication training whatsoever. The residents said they would love to learn more, but they didn't want to involve their supervisors, because the supervisors didn't know how to do it themselves - they were negative role models.

Moving on to best practices: with such a busy schedule, how do you find time for planning your research, conducting the research, and writing, on top of all of the teaching you do?

I've learned valuable lessons. When I was younger I thought I could do it all, and you can't do it all. I have a family with four children, I had a busy job, my wife had a busy job, and you pay the price for that at some point in time. And I did. There was a period when I got overworked, but it helped me a lot because it opened my eyes, and I started to care more about my own time and my time for the family. What I do now is schedule moments where I can work on my own things. I do that by not coming to the office one day a week and by planning weeks, usually two to three a year, as writing periods. That's what keeps me sane. In the past, everything went to the evenings and the weekends. By blocking my time, I was able to have a better balance. If you asked my children, they would tell you that dad works too much and that I was a negative role model for them. Interestingly, they work too hard now too. So better structuring my work time is my answer to you. You can't do it all. I once had a colleague say to me that he was going to give me a piece of paper with a hundred different ways of saying no. Another guilt I had was that when I was at the football field or at swimming lessons, I was always sitting there with a paper. I understand now how difficult it can be. Also, in the beginning, international travel was difficult, but it got a lot better once my kids were older.

When looking at journals for publications, how do you decide where to aim when starting to write?

I think I've become very mindful about what kind of audience I want to tell the story to, and that defines the journal. You get to know the journals a lot better and you get to know the audiences behind the journals, so the matching process keeps getting better. For example, if you submit to Academic Medicine, you very much address a US-based audience. For Medical Teacher, your audience is international and much more practically oriented than the audience of Medical Education. Or another example: I'm involved in simulation of postpartum hemorrhage in obstetrics, and there it's important to have a clinical audience that can understand the outcomes. It's the audience that defines the journal. I've published in journals like The Lancet, and then you have a completely different audience: you have to be very careful with your language and cannot use any jargon. It's a long process of learning your audience and how to write for them.

Do you have any favorite journals that you like to work with or does it depend on the topic?

It depends on the topic, but I do like Medical Teacher a lot because it connects to people in practice. There's a big complaint in the general education literature that educational research is not relevant to teachers in educational practice - in other words, the practitioners/teachers and the researchers are disengaged from each other. What is unique to medical education is that this gap is not as wide as in general education. There are a zillion education meetings around the world where both practitioners and scholars meet. Many of our master's and PhD students are teachers in their own institutions. I don't know any other profession that has so many education journals; I've lost count of the number of journals that we have. This is quite unique to medical education, and we should really cherish it. You learn a lot about educational evidence and theory, and later on you apply that to educational practice. So that's why I like Medical Teacher: it still has this practical approach and addresses a practical audience.

How do you decide which conferences and workshops to attend?

I think the answer, again, is: which audience do you want to engage with? If you look at the meetings around the globe, there are different audiences. For example, we just had the OTTAWA conference in Abu Dhabi, which is an assessment conference, and those are different people that you meet there. For the young people I mentor, it's important that they have exposure to different audiences. For example, I often discourage people from going to the AAMC meeting because it's very American, and I encourage people to go to AMEE because everyone goes and you see a lot of people. But I also expose them to meetings in Asia. Again, it's the kind of audience you want to address. Expose yourself internationally. There's a very different approach to education in different parts of the world, very different cultures. If you compare one assessment culture with the one in Denmark or Finland, they're completely different cultures, and very different again from Asian and Middle Eastern ones. Expose yourself and learn.

When conducting research, how do you decide who to collaborate with on projects?

Three things: people I can have fun with, people I can be productive with, and relationships that are reciprocal. If there is an imbalance in the relationship, it won't work. Or when people start profiting from you, it won't work. I'm quickly done with those people. It should be fun, productive, and reciprocal.

What do you know now that you wish you had known early on in your career?

That's a very good question, because I came to education completely naive, and over the years I've learned a lot about what is important in education; it's this kind of built-up expertise that gives me my personal wisdom. They say there is no shortcut to experience, so it's impossible to have this experience early in your career. You have to learn on your own and try to engage in situations where you can learn and work with other people so that you are able to learn. And follow your heart. Do the fun things. Follow your curiosity.

I was listening in on a similar conversation with David Irby from UCSF recently, and he came from a different direction - he started with qualitative research and then got exposed to quantitative and statistical methods, but he also saw the value in both. He was asked how he seeks out opportunities, collaborators, and types of research, and he is very much driven by his drive to learn. I just wondered how much learning, and the opportunity to learn something different or to expand your knowledge, is implicit in that fun (Sebastian Uijtdehaage, PhD)?

Absolutely - that's one of the reasons I wanted to be an academic. It's the learning and the freedom that have motivated me to do what I have done, and I'm still learning. In five years I have to retire, so I've had a long career, but I still believe that I'm learning.

You mentioned all of the different scholars, journals, and investigations in the field, so how do you stay on top of the literature (Kelsey Larson, PhD)?

That's a challenging question because, first of all, I muddle through. Fortunately, nowadays all journals send out electronic tables of contents, and I scan those. But I guess most of my exposure to the literature is through my PhD students. They are able to read a lot more than I do, they keep me posted, and through that I learn a lot and pick up a lot. You always have a particular topic that you follow. I won't miss a paper in the literature on programmatic assessment, because I'm so interested and want to explore what happens in the world of programmatic assessment. I will systematically look through the literature for those papers.

You noted the closer links between research and practice in medical education as compared to general education. What do you think we can do in the field of medical education to strengthen these links even more (Anthony Artino, PhD)?

Here's a fundamental difference: what I see with health professions education units around the world is the importance of being part of educational practice yourself. In my faculty and my education group, we not only do education research, but we also provide services to the training programs here. So we do faculty development, program evaluation, and instructional design. I just participated in a retreat on our next curriculum revision in medicine, and I can tell you that if we achieve those plans, it will be very different from what we have right now. Being in the middle of educational practice has a lot of value, I think. To give you an example, I am a regular faculty development trainer for clinicians who want to learn more about how to assess in the workplace. I've had a zillion debates with clinicians on assessment in the workplace, and I think that has shaped my knowledge of assessment as much as scientific publications have. So the connection between education research and practice is an important one. Different health professions education units take different positions on that. I see units coming about, particularly in Canada, that only focus on research. I think that within my own institution we should have internal relevance: our own expertise should be relevant to our own teaching and learning practice. If that is not the case, then it is a different story. Again, I like this close connection between educational research and educational practice, and if we can support that, develop that, and promote that in any way, I would be strongly in favor of it.

Erin S. Barry