
Methodology in My Practice

Over the span of my practice I have adopted a range of methodologies and employed many methods, as well as supervising students as they developed their methodological approaches and applied methods for Bachelor's, Master's and Doctoral dissertations. This section focusses on those methodologies that I have applied directly in my design practice.

Approaches

Design & development

The key approach in my practice has been the design and development of educational software, multimedia resources, systems and ultimately courses. This design approach has taken place in a context where new technology offers new and unknown opportunities and which, despite disquiet about technology-led approaches, has inspired creativity and innovation in my practice. At the heart of this research approach has been a combination of developing design methodology and rigorous evaluation in real-world contexts. In this sense, I have been unconsciously engaged in a 'design science' approach:

A design science of education should be based on a linguistic framework which offers an intermediate level of systematisation, rising above anecdotes but remaining grounded in reality. Such a framework would allow us to capture the structure of educational situations, the challenges they engender, as well as the means of addressing them, in forms which should empower learners and teachers to control their practice as much as it allows researchers to inspect it scientifically.
(Mor 2010, p14)

I would argue that the analytical perspectives I present in the Theoretical and Conceptual Framework are my version of Mor's 'linguistic framework'.

Participant action research

As my work developed, I became increasingly conscious that I was developing a participant action research approach (Denzin and Lincoln, pp33-34) to complement design & development. This was the result of a growing interest in, and opportunity for, designing courses, degree programmes and ultimately secondary (Notschool.net project) and higher education organisations ([C18] Ultraversity Project). In each case the concept of co-research with students became ever more explicit.

Specific Methods Used

Prototyping, iterative development and field testing

In developing new interactive educational software, an early discovery was that the traditional waterfall method (Bell and Thayer 1976) of identifying user needs followed by specification, implementation and testing would not work. Participants (including myself) in the design process were discovering new needs, had little ability to specify unknown designs offering new practices, and found themselves learning through the process of development in an 'expansive' sense (Engeström 1999). A further complication in practice was that the computers in use had a range of features and capabilities, and the design team would often have diverse understandings of what could be achieved. So the method employed was to prototype initial ideas, producing a working design, neither fully debugged nor complete, to inform the next steps and inspire further invention.

Prototyping was only the beginning, of course, and was followed by cycles of development and field testing, often carried out in classrooms by the teacher participants in the design and development process, whose understanding was also growing. Alongside the successive improvement of the software itself, there was a parallel and important task: developing the teachers' and students' guidance material, which underwent a similar process.

As Mor puts it:

The design element in a design study may refer to the pedagogy, the activity, or the tools used. In some cases, the researchers will focus on iterative refinement of the educational design while keeping the tools fixed, whereas in others they may highlight the tools, applying a free-flowing approach to the activities. In yet others they will aspire to achieve a coherent and comprehensive design of the activity system as a whole.
(Mor 2010, p27)

Analysis

Frequently in the process, a failure in use would be identified in broad terms - a teacher or student would report that some aspect was unclear, difficult to use or simply baffling. At this point it was important to analyse the software, and the practice, to discover where improvement needed to be made. At first this was done informally, drawing on tacit knowledge of 'what works', but later, after making sense of knowledge from experts, the task was improved by drawing on insights from the worlds of visual design and human-computer interface design. The input from visual design theory offered clarity about the simplest ideas of placement and the overlapping of graphical elements on the screen, the treatment of 'white space', typography and combinations of colours. The input from human-computer interface theory came primarily from Donald Norman's work on the task analysis of operating equipment, and resulted in our own interpretation, expressed as An Analysis of a Single Interaction (Millwood and Riley 1988), to guide colleagues in our team.
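For illustration only, the Python sketch below shows one way an analysis of a single interaction might be recorded, loosely following Donald Norman's stages-of-action model. The stage names and the example observations are assumptions made for this sketch, not a reproduction of the 1988 analysis.

    # A minimal sketch of recording an analysis of a single interaction.
    # The stages loosely follow Norman's action cycle; the observations are invented.

    STAGES = [
        "form the goal",
        "form the intention",
        "specify the action",
        "execute the action",
        "perceive the system state",
        "interpret the system state",
        "evaluate the outcome",
    ]

    def report(observations):
        """Print each stage with any difficulty a teacher or student reported there."""
        for stage in STAGES:
            print(f"{stage:28} {observations.get(stage, 'no difficulty reported')}")

    # Example: a learner could act, but could not tell whether the action had worked.
    report({
        "perceive the system state": "no visible feedback after dragging the tile",
        "evaluate the outcome": "learner unsure whether the answer was accepted",
    })

Laying the stages out in this way makes it easier to see whether a reported difficulty lies in executing an action or in evaluating its result, which in turn points to where the software or the guidance material needs improvement.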

Survey

In later work, relating to the development of courses and educational practice, evaluation through direct questioning of participants became an additional method. Particularly with the advent of online surveys, with their immediate and low-cost summary analysis, this became an important method. This methodology was developed further, to take advantage of the particular strengths of interactive designs, in Making Choices (Millwood 1993), a tool for modelling decisions by interactively dragging choices into order, and in the COGS passport, a tool for transition between primary and secondary schooling which helped learners evaluate their competencies by dragging elements in a geometric design. This design thread has been developed most recently in the design of interactive learning needs analysis for health professionals and volunteers in the charity Macmillan Cancer Support.
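To illustrate the kind of immediate summary analysis such interactive designs afford, the Python sketch below (with invented choices and responses, not data from the projects named above) averages the positions respondents give to each choice when dragging them into order.

    # A minimal sketch: each respondent drags choices into preference order,
    # and mean rank positions (1 = most preferred) are summarised at once.
    from collections import defaultdict

    def mean_ranks(responses):
        """Average the position each choice received across all respondents."""
        totals = defaultdict(float)
        for ordering in responses:
            for position, choice in enumerate(ordering, start=1):
                totals[choice] += position
        averages = {choice: total / len(responses) for choice, total in totals.items()}
        return sorted(averages.items(), key=lambda item: item[1])

    # Three invented respondents order the same three options.
    print(mean_ranks([
        ["work placement", "group project", "lecture"],
        ["group project", "work placement", "lecture"],
        ["work placement", "lecture", "group project"],
    ]))

The point of the sketch is simply that an interactive ordering task yields structured data that can be summarised and fed back to participants straight away, rather than awaiting manual collation.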

Videography

In several projects, understanding the holistic context and seeing the detailed activity became important. In these cases, video recordings of the activity, or of discussion evaluating it, were made, although such material could prove challenging to access and analyse. In some cases (Ultraversity Project 2006), the video was transcribed and the transcription added to the video as a searchable 'text track'. Added value could be obtained by adding further text tracks for chapters and for keyword analysis, permitting the video to be used as the vehicle for dissemination of research findings, not simply as the data gathered.
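The Python sketch below illustrates the idea of a searchable text track, using the present-day WebVTT format as a stand-in for the text tracks described above; the transcript segments are invented and the format choice is an assumption, not a record of the tools used at the time.

    # A minimal sketch: a timed transcript becomes a text track and can be searched.
    def to_webvtt(segments):
        """segments: list of (start_seconds, end_seconds, text) tuples."""
        def stamp(seconds):
            hours, rest = divmod(seconds, 3600)
            minutes, secs = divmod(rest, 60)
            return f"{int(hours):02d}:{int(minutes):02d}:{secs:06.3f}"
        cues = [f"{stamp(s)} --> {stamp(e)}\n{text}" for s, e, text in segments]
        return "WEBVTT\n\n" + "\n\n".join(cues) + "\n"

    def search(segments, keyword):
        """Return the transcript segments whose text mentions the keyword."""
        return [seg for seg in segments if keyword.lower() in seg[2].lower()]

    transcript = [
        (0.0, 6.5, "Researcher: how did the online community support your inquiry?"),
        (6.5, 14.0, "Student: talking with the other researchers kept me motivated."),
    ]
    print(to_webvtt(transcript))
    print(search(transcript, "motivated"))

Because each cue is time-aligned, a keyword match leads straight back to the relevant moment in the video, which is what makes the video itself usable as a vehicle for dissemination.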

Structured interview and grounded theory approaches

In creating innovation in higher education, it became important to evaluate the experience of students and tutors in greater depth. In these cases we developed interview frameworks, conducted the interviews, recorded and transcribed the audio, and then applied an interpretative phenomenological analysis (Smith et al. 2009) to the data to discover, in a grounded sense, the key themes of participants' responses to our innovations (Millwood and Powell 2009).
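As a simple illustration of the mechanical part of this analysis, the Python sketch below (with invented codes and excerpts, not our interview data) tallies the emergent codes attached to transcript excerpts so that candidate themes can be seen at a glance; the interpretive work of naming and relating themes remains, of course, with the researcher.

    # A minimal sketch: coded transcript excerpts are tallied to surface candidate themes.
    from collections import Counter

    coded_excerpts = [
        ("student A", "the online community carried me through", ["peer support"]),
        ("student B", "negotiating my own project made it feel relevant", ["ownership", "work relevance"]),
        ("tutor A",   "my role shifted from lecturing to facilitating", ["changed tutor role"]),
        ("student C", "other researchers in the cohort kept me going", ["peer support"]),
    ]

    code_counts = Counter(code for _, _, codes in coded_excerpts for code in codes)
    for code, count in code_counts.most_common():
        print(f"{code}: appears in {count} excerpt(s)")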


"The art of losing isn't hard to master;
so many things seem filled with the intent
to be lost that their loss is no disaster.

Lose something every day. Accept the fluster
of lost door keys, the hour badly spent.
The art of losing isn't hard to master.

Then practice losing farther, losing faster:
places, and names, and where it was you meant
to travel. None of these will bring disaster.

I lost my mother's watch. And look! my last, or
next-to-last, of three loved houses went.
The art of losing isn't hard to master.

I lost two cities, lovely ones. And, vaster,
some realms I owned, two rivers, a continent.
I miss them, but it wasn't a disaster.

--Even losing you (the joking voice, a gesture
I love) I shan't have lied. It's evident
the art of losing's not too hard to master
though it may look like (Write it!) like disaster."

― One Art, Elizabeth Bishop, 1976