Increasing social science research capacity: some supply-side considerations

Neil Lunt
Lecturer, School of Social and Cultural Studies, Massey University at Albany

Carl Davidson
Director, No Doubt Research Limited, e-centre, Massey University at Albany


Abstract

This paper discusses the development of policy-related research capacity, with particular reference to the place of tertiary education. With regard to the recent demands for more policy-focused research, it argues that deficiencies on the supply side may prove a systematic inhibitor to a more satisfactory “utilisation equilibrium”. These are issues that the public sector in the broadest sense needs to act upon. Otherwise there is the danger that a capacity gap will result in growing dissatisfaction with the search for answers to the evidence-based questions “What works?” and “Where is the proof?”


Introduction

An important shift in policy thinking, both locally and internationally, has been the call for better evidence on which to base policy, programme and professional intervention. At all levels of policy development key questions being articulated are “What works?” and “Where is the proof?” The question of what qualifies as adequate proof has always been a complex one, combining as it does political influences, organisational memory, tacit knowledge, and the persuasiveness of various proponents (Polanyi 1957, Kuhn 1962, Lindblom 1979, Smith 1999). However, and perhaps because of a recognition of that complexity, among the range of sources accepted as evidence, research has always laid claim to a prized place.

A number of factors are exerting pressure for evidence-based, or at least evidence-influenced, activity. The debate is a wide one, encompassing a plethora of policy spheres, disciplines and methodologies. A key driver here is the continuing emphasis within the public sector to secure value for money, close gaps, ensure accountability, and improve service delivery. It has been suggested that high-quality evidence is increasingly being demanded as part of the policy process (Davies et al. 2000, Ministry of Social Policy 2001). There has been recognition of the limited part played by more focused evaluation, and a debate has re-emerged about the relationship of research, evaluation and policy (State Services Commission 1999, Smithies and Bidrose 2000, Ministry of Social Policy 2001). We say re-emerged because the debate about the role of research and its links to government policy is a longstanding one, with discussions around the use of social science within government beginning in the 1930s (cf. Robb 1987) with the Social Science Research Bureau; growth within the government sector and the universities in the post-war period (Marsh 1952, McCreary 1971, Mackay 1975) and disillusionment from the 1970s and the corresponding search for an appropriate organisational model and configuration of social science (e.g., Gibson 1970, Shields 1975, Fougere and Orbell 1975, Keir 1982).

The purpose of this paper is not to rehearse debates about the value of research in the policy-making process, but to offer some ideas about preconditions for usage. The paper then identifies some of the barriers currently present in the tertiary and public sectors that will – in all probability – militate against maximising that usage, and it ends with some suggestions about what could be done to overcome those barriers. Essentially, our discussion is driven by what the Ministry of Social Policy (2001) discussion document refers to as a “rightward shifting demand curve by central agencies”, and what we identify as a particular bottleneck to supply emanating from the tertiary sector.


Internal and External Capacity

High-quality, policy-relevant work may be sourced from a range of suppliers. Here we identify three main types:

  • an agency’s own internal research and evaluation capacity;
  • a contractor–commissioner relation (with the contract going to the tertiary sector, market research sector, private contractors or a mixture); and
  • a hybrid relationship, which may involve internal research capacity combined with external modifiers – typically consultants bringing specific skills.

Genuine contractor–commissioner relationships took hold from the 1970s, with grants-in-aid being replaced by tighter contract specifications. The period since has seen a refinement of the contractor relationship and the “Rothschild principles” (Kogan and Henkel 1983), with research commissioning and management now an identifiable public service role (Hawkins 2000).

A range of steps have been identified as crucial to ensuring good-quality research, and these focus on:

  • effective commissioning;
  • effective research in the field; and
  • effective utilisation.

Clearly, these stages are in practice interrelated. Determinants of the final quality and impact of commissioned research include (but are not restricted to):

  • adequate conceptualisation of research and framing of questions – projects that ask interesting but not the most policy-relevant questions, or that adopt inappropriate methods, undermine evidence-based activity;
  • research that has clear and achievable objectives, and promises outcomes that are timely;
  • effective research management and liaison to ensure early identification of problems (commonly, scope creep or timeframe compression);
  • rigorous analysis that addresses research questions and has policy relevance, but also recognises the limitations as well as the strengths of the research; and
  • highly trained and prepared researchers, with adequate supervision, working within optimal-sized teams (Weiss 1977, Weiss 1998).


Many of these considerations were identified and discussed by the State Services Commission in its timely review of evaluation and policy advice (1999), and are echoed in the social policy capacity review and Ministerial statements (Ministry of Social Policy 2001, Minister for Research, Science and Technology 2001, Social Science Reference Group 2001). The State Services Commission’s report focused on a range of issues around the utilisation of research results. Discussion of research utilisation and of the relationships between research commissioners, policy advisers and Ministers is to be welcomed. Our discussion here is far more modest in scope, and driven by the fifth point we have outlined above (researcher capability). Arguably, it is one that must be addressed for evidence-based thinking to gain a stronger foothold in particular policy areas.

Instead of focusing on the demand side of social science research (what research gets commissioned, and what happens to the products of research), we discuss supply-side considerations and, in particular, research training and preparation. The piece is informed by historical and contemporary reflections on the development of social science and social research within New Zealand; teaching and writing in the area of social research methods; and our experience of contract research work and relationships with government departments.

Broadly, we argue that better research preparation will contribute to narrowing the gap between supply and demand of policy-relevant research. We suggest a number of possible directions, including rapprochement between academic teachers/researchers and those based within central departments. We write from disciplinary traditions of sociology, political science and psychology, drawing on backgrounds that lean towards social – as opposed to more economic – research. Many comments we make, however, will also speak to those more heavily economic constituencies.


Revisiting the supply side: the organisation of applied social research

It is worthwhile briefly reviewing the roles of those involved in the contracting/commissioning relationships, and how these roles have altered in recent years as the result of particular pressures.


Government Departments

Government departments have had a relationship with social (and economic) research that stemmed from a desire to collect routine statistics and information about their activities to inform reporting and service delivery. Some of the earliest examples of specialised policy advice being utilised in making government policy were in the form of economists giving evidence to Royal Commissions and establishing networks with senior public servants and Ministers (Endres 1987, Endres and Fleming 1994). Governmental research capacity was largely established during the 1950s and 1960s. The 1970s was a period of continued development and discussion about the appropriate place of social science research – and growing pressure (albeit unsuccessful) to establish an independent social research agency away from immediate short-term pressures (Shields 1975, Fougere and Orbell 1975). The 1980s and early 1990s can be seen as times when less emphasis was placed on research activity, and major changes were introduced without (and sometimes in the face of contradictory) research evidence. There was a running down of research capacity, and social science was seen to have a more limited contribution to make to society and policy than previously anticipated. There was a focus on a priori economic models as a basis for action, rather than an investigation of the interplay of theory, action and evidence (cf. Easton 1997).

Since the early 1990s there has been increased emphasis on applied research, with a particular focus on evaluation of effectiveness. Central to the latter has been the attempt to address the question, “Does it work and how can we improve it?”. There has been an attempt to go beyond costs and inputs, to prioritise outcomes and impacts. The language of evaluation (formative, summative), a desire for outcomes, and stakeholder consultation within the research process, are increasingly emphasised as part of central agency activities. A range of factors drives the process: value for money, outcomes, accountability, and the desire to deliver appropriate services to diverse communities (Business Information in Action 1999).

The second major research role is that of commissioner. Frequently departments lack in-house capacity and, consequently, engage in a range of commissioning relationships: these include the Department of Labour, Ministry of Justice, Te Puni Kokiri, Ministry of Social Development, Department of Internal Affairs, to say nothing of the vast area of health-related research activity.

Tertiary Role

Tertiary institutions occupy three distinct research positions. First, they are potential contract providers of research (either on their own or as part of a broader research team). In New Zealand there are few research centres outside of health with a primary function of full-time research (apart from those that have arisen for Maori and Pacific Island research). In the field of non-health social policy, most research units or centres are more virtual than real, a situation well captured by Bulmer’s comment that within teaching departments there are those “who may have a letterhead for a research ‘centre’, but who are university teachers who do research part of the time” (Bulmer 1987b:233). Indeed, those who are teaching within the tertiary system can expect to be teaching more and bigger classes than they did a decade ago.

There are some advantages to this, given that the second major research role of tertiary institutions is that of teaching research methods. This will include introducing students to the range of concepts, discussions and skills that contribute to the field of research practice. For such teaching to be effective (and hence useful), it is paramount that those who teach research ensure their skill sets and experiences remain current (a point we will return to below).

The third role is that of critic and conscience. This echoes long-standing views of the New Zealand University system that were made by Karl Popper and colleagues at Canterbury in the post-war period:

The commonly held view that the University is primarily a teaching institution should be abandoned, and the University should be looked upon as an institution in which the spirit of free inquiry is preserved and cultivated. (Allen et al. 1945)

We do not wish to lament how recent changes within tertiary education and broader society have significantly changed the role of the university. In part such appeals are to a mythologised past: the university has historically had multiple functions, with only one of those being as critic and conscience (Kerr 1963, Parton 1979, Rothblatt and Wittrock 1993). With research training as one of its functions, what is required is a consideration of the demands exerted by the real world of social and economic research. This includes equipping individuals with critical and applied research skills.

Market Research

Given the absence of research centres dedicated to full-time social research, those who commission research have often turned to market research providers to meet their needs (e.g., Colmar Brunton 1997, Forsyte 1998, 1999). While a number of market research companies have provided sophisticated and considered research outcomes, in many cases the structure of the industry militates against this. Specifically, those recruited to market research have a different disciplinary background (and, often, skill sets) to those who work in social science research; for instance, drawing on graduates with a background in marketing or business in contrast to the sociological, anthropological or social policy background of dedicated policy researchers. This difference in emphasis has significant consequences in terms of enabling market researchers to embed their research in a broader context, or to be sufficiently critical of the structural (but often informal) constraints on policy application.

Furthermore, the market research industry is one in which successful researchers are highly mobile, making it difficult for government departments and agencies to form meaningful alliances (for instance, through the creation of shared intellectual capital). Finally, the market research industry places less emphasis on the politics and ethics of research, being much more instrumental in orientation (Curtis et al. 2001, McKernon 2001). This has consequences for the development of research processes and procedures that take into account the different (and, sometimes, contradictory) demands of the various groups included in a research population. In short, market research is an activity that is more likely to deal well with method but much less well with questions of the broader methodological context. Questions of ontology, epistemology and metaphysics are almost entirely lacking from mainstream market research. There is a failure to acknowledge that these even underpin research activity, and how description and typology displace explanation, causation, and understanding. It is our contention here that government departments and agencies cannot afford to be so atheoretical in their own research demands.

Of course, there is a subset of public sector social research that is entirely the province of market research companies. For instance, researching customer satisfaction is something that market research companies (with their CATI – computer-assisted telephone interviewing – capability) will always be able to do better than dedicated social research consultants, or even a dedicated social science research institute. That said, we would argue that satisfaction surveys are the handmaiden of service monitoring, and contribute little compared to research and evaluation for strategic policy development.

Private Providers

Precisely because both university departments and market research companies are generally poorly adapted to meet the strategic research needs of government departments and agencies, we have seen the rise of a number of dedicated social/policy research consultancies. A number of these consultancies (and especially some of the Wellington-based ones) have delivered policy-useable outcomes. These companies are often formed around a core of significant public sector experience, or provide specialist niche-based services (for instance, expertise in relation to Maori research and consultation, or specialising in research with children, criminal justice, or the field of disability). The skills, experiences and theoretical orientation of these companies make them preferred partners for government departments and agencies. However, the irony here is precisely that these organisations are often staffed by researchers and managers who learned their trade in the public service and were outplaced in the downsizing of the 1990s. How these bodies of expertise and experience can be utilised to generate internal capacity is a crucial consideration (Davidson and Lunt 2001, Davidson and Voss 2002).


The problem with method: capability, capacity, and capture

We have outlined the broader problems with trying to use either university departments or market research companies to provide this capability. Although there are a number of consultancies that do provide appropriate research capability, their capacity for research is limited. A major bottleneck to increasing the number of high-quality researchers lies in the role of the tertiary sector.

An important part of the problem here is how research is frequently taught at tertiary institutions throughout the country. This problem is a little easier to understand with a simple riddle. Q: When it comes to reading a research report, what is the difference between a social science undergraduate, a programme funder and a teacher of social research methods? A: The undergraduate reads the text and skips the tables; the funder reads the tables and skips the text; the methods lecturer cares little about either the tables or the text, as long as they agree with each other.

The point being made here is that it is perfectly possible (and, worse, highly probable) that these three different people will inhabit separate worlds, will have few opportunities to talk to each other, and will end up speaking in a different language. The real paradox for anyone wanting to pursue a career in social research (whether in central or local government, or as a consultant) is that, although the horizons are bright, future researchers are frequently exiting the tertiary system ill-equipped for the demands that will be placed on them (Tolich and Davidson 1999).

The horizons are bright given the push for evidence-based policy, practice and intervention, and a more favourable policy climate for the contribution of social science. In part this is about a more balanced view of social science that considers theory, rigour, application and limitations. However, opportunities for the rehabilitation of social research into the body politic will be lost if we do not address the extent to which we are meeting that need.

What is Wrong with Method?

Social research method was slow to develop within the tertiary context. Despite the earlier contributions of economics, anthropology, psychology and social work, it took the development of sociology departments, appointments and chairs (in the mid to late 1960s) and the expansion of social science and applied research to cement social research methods within the university curriculum. In line with a great deal of humanities and social science teaching at this time, courses drew on overseas texts and examples (British and American). In terms of social research in New Zealand, there was no landmark New Zealand text or thinkers (cf. Timms 1970, Lunt 1999).

Despite the development of professional bodies and great expectations about the universities’ contribution to social science, there has always been criticism of the lack of attention paid to policy-oriented topics by social and economic researchers (cf. Holmes 1981:14). For example, despite huge developments in sampling theory, academic economics has been criticised as preoccupied with a priori theorising and abstraction rather than real-world circumstances and conditions (see Easton 1997, “Epilogue”). In part, the search by economics and sociology for more academic and theoretical credibility contributed to this division. Professional incentives also detracted from a stronger practice focus – overseas publishing outlets were less likely to publish material that retained a solely domestic policy focus. This division is reflected in the internalising of research method, driven by sustained theoretical debates. These debates, although important, absorbed a great deal of policy-research energy.

While recognising there are historical factors, we identify a series of contemporary dimensions:

  • the qualitative onslaught;
  • a distrust of policy focus; and
  • “bridging the gap”.

The Qualitative Onslaught

Let us introduce this point by noting that we both consider ourselves qualitative researchers (having conducted a great deal of such research), that we have written widely about qualitative research (e.g., Tolich and Davidson 1999), and that we certainly recognise the value of qualitative research. As such, what we have to say here may initially appear strange – namely, that the onslaught of qualitative research is something that needs to be moderated within the policy process. Bear with us while we develop our line of reasoning.

To start with, although teaching pluralistic accounts of research (populated with different voices) is important and necessary, this needs to be accompanied by an understanding of the institutional and/or policy context that attempts to reconcile that pluralism – and especially so where the power associated with the competing perspectives is unequal.

Second, it makes little sense to have researchers who specialise in qualitative research yet do not appreciate that there are situations when such an approach is either inappropriate or less useful than the alternatives. This means that researchers need to be trained to recognise the limitations of their own approaches, or sufficiently multi-skilled to be able to shift approaches when the question at hand calls for it. Most practising researchers recognise that the dichotomy of quantitative and qualitative research is a false one, yet it is one that remains among some teachers of research methods. In the worst cases, the arguments against multi-skilling researchers seem reminiscent of the “four legs good, two legs bad” arguments of George Orwell’s Animal Farm.

The question of multi-skilled researchers (in the broadest sense of method) leads to our third criticism: that much that passes as qualitative research focuses on the method of gathering data but not the logic of the research design. The act of data gathering, though, builds on both the kinds of questions asked (and how they were developed) and how the participants were selected to take part in the research (sampling). The logic of the research design has important consequences for the kinds of claims that can be legitimately made for the data collected, and these limitations need to be at the forefront of the researcher’s mind. To take the obvious example, how many times have we seen research providers (or commissioners) attempt to generalise from a qualitative research initiative? Our favourite example involves a research project built around a small number of focus groups, and a client asking “What is the margin of error for these results?” The social policy discussion paper reminds us how these background factors translate into inconsistent qualitative research for policy usage (Ministry of Social Policy 2001).
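As a rough illustration (the figures here are purely notional, not drawn from any particular study): a margin of error has a defined meaning only for probability samples. For a simple random sample of $n$ respondents used to estimate a proportion $p$, the conventional 95% margin of error is approximately

\[
\text{MoE}_{95\%} \approx 1.96 \sqrt{\frac{p(1-p)}{n}},
\]

so a survey of 1,000 randomly selected respondents with $p = 0.5$ carries a margin of error of roughly ±3 percentage points. A handful of focus groups involves no random selection mechanism at all, so the quantity is simply undefined – which is why the client’s question, however reasonable it sounds, cannot be answered.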

Ironically, our major criticism of qualitative research is that far too much of it is taking place in one kind of intellectual vacuum or another (be it in terms of policy or institutional context, a broader epistemological context, or even a specific methodological context). This can result in the dismissal of whole disciplines (e.g., statistics or economics) or the discounting of particular research traditions (e.g., quantitative methodology within sociology). This is seen in an emphasis on gaining accounts, exploring voices, and building rich descriptions, over achieving understanding, explanation, comparison and evaluation. Our experience is that this is especially likely to occur at the postgraduate level, both master’s and PhD. As a result, we believe that the post-modern turn of looking for discursive articulations in every phrase, whatever else it may have achieved, has inadvertently turned a whole generation of researchers away from a meaningful contribution to policy – both in terms of substantive orientation and the skills being privileged. Needless to add, we consider a great deal of this research to be of low quality to boot. It is our contention that the postgraduate years do not give our “researchers” the training or experience they should (or once did) and are not as useful as they could (and should) be. Moreover, the pull of Grand Theory has tended to mask optimism and to discourage engagement with minutiae in preference for broader theory. Identifying power and oppression is central, but must be accompanied by some notion of how we are to move beyond such positions.

Distrust of Policy Focus

As we have noted above, a serious problem underpinning the question of policy research capability and capacity is simply a lack of policy-focused undergraduate and postgraduate work. The fact that some of our universities have developed departments specifically focused on social policy may be a factor here, by inadvertently claiming such research questions for their own constituents. And yet there is a range of academic disciplines that could and, again, should, be contributing to this work (e.g., anthropology, economics, geography, sociology, social work, politics and history, to name a few). Let us be clear that we are not supporting a clear break between pure and applied research here, nor arguing for the demise of the former. We are simply arguing the case for more problem-focused, policy-relevant, applied work, particularly from those disciplines that identify such activities as their remit and a strength.

The fact is that we have graduated generations of students who – while they are aware of theoretical debates around research as quantitative, “gendered”, disempowering, Eurocentric, etc. – are without the skills to construct a solid research design that recognises the difference between a method and a methodology (or a method and a design) and addresses practical, real-world considerations around feasibility and practicality. Thus at both undergraduate and graduate levels it is not uncommon to find critical faculties not matched by an understanding of the real research context. The danger is one of preparing people for academic careers that do not exist. There are wider implications, because individuals are not only ill-equipped as social researchers, but also miss out on skill sets that make social scientists valuable in the jobs market – old-fashioned thinking and problem-solving skills (Davidson 1999, Davidson and Lunt 2001). Our hope is that the debate about evidence-based policy and practice may encourage more attention to qualitative work, but with a greater willingness to attend to issues of scale.

Bridging the Gap

Research is not solely about having a knowledge of method. It is about the application of method. Essentially it is problem solving. Thus it invokes shades of what Schön (1983) has called reflective practice and knowledge in action. The process of research both in and for the real world requires an understanding of context; feasibility needs to be an integral part of any research design and planning. Because the world of real research is one in which things do go wrong, where programmes being evaluated are slow to reach their optimum intake, and where interests and egos do collide, researchers must be cognisant of and responsive to this.

The tertiary sector would be well advised to consider some Weber-like notions of the sorts of conditions for ideal policy evaluation, such as having a long timeline, a patient policy audience, sufficient resources, extensive groundwork, and a stable programme that has worked out implementation issues (Hawkins 1999). Given that reality is very different, this points us towards rethinking how we prepare future researchers for the messy world of policy and research.


Supply-side Opportunities

There is a series of opportunities for the development of social research methods within New Zealand. These include the evidence-based turn in policy making, pluralism of research designs and methodologies, and new texts focused on New Zealand social research methods. A series of links needs further developing and fostering, and we look at a number of examples below.


Policy–Tertiary Links

Policy–tertiary links include links between Auckland and Wellington research/academic communities, and between government departments and the tertiary environment. The social policy paper notes the absence of a Wellington-based, academic-focused social policy department. Particularly pressing is the need for an evaluation qualification and, indeed, a locally grounded standard text (Lunt, McKegg and Davidson in press). A range of strategies may foster links, including jointly contributing to training and courses in building capacity. Internships, contract research, discussion on postgraduate topics and funding are other dimensions. The recent Social Science Reference Group has identified a number of strategies aimed at improving linkages between government and tertiary sectors, one of which is around making better “connections”.

Maori and Pacific Peoples Research Capacity

A pressing gap is that of Maori and Pacific peoples research capacity. This is driven by a number of considerations, including:

  • Treaty obligations;
  • population change; and
  • a desire to be more responsive to communities in how resources are distributed and services are delivered.

A key part of this is better research and evaluation activity – for effective consultation, service planning, and debate around meaningful outcomes. Courses and supportive relationships must ensure a future supply of well-skilled and confident researchers, able to undertake policy-focused work for central policy agencies and iwi authorities. Many research courses deal inadequately – if at all – with such issues, appending pieces of a more holistic kaupapa framework to a pre-existing curriculum. The current funding of tertiary education may not by itself be enough to bring forward the next generation of researchers, and it may require a strategic central push to ensure adequate capacity. The recent establishment of a centre of excellence for Maori Development may provide some impetus here. But we would suggest that the onus is on all institutions offering social research courses to become centres of excellence in how they teach about research by or with Maori.

Tertiary Training and Education

There needs to be recognition that intensive supervision of postgraduates and undergraduates is required for good research preparation. Induction into a real world of research also implies that staff members themselves are undertaking research. Mentoring is crucial for ensuring the development of future researchers (Social Science Reference Group 2001). There is a limit to the extent to which good research can be done from the desk – it is a field activity. And field activities are labour-intensive.

Moreover, the postgraduate relationship may need some rethinking. If the amount of postgraduate work that gets published is any indication, there is still a long way to go in terms of high-quality, policy-focused preparation. A number of government agencies offer studentships (e.g., the Department of Child, Youth and Family Services, and the Ministries of Transport and Labour). Universities also need to look at other things, however, including developing models of mentoring (getting postgraduates on research teams) and offering them a range of opportunities to develop skills. This may include funding from the Foundation for Research, Science and Technology and the Health Research Council, as well as funding from local authorities and commercial bodies.

The question of research training goes beyond hands-on experience (and mentoring) of research skills. It extends into the question of the real-world relevance of the disciplines in which students are trained. To take one example the authors are familiar with, the Sociological Association of Aotearoa (New Zealand) has recently (once again) focused on the question of the role that sociology can and should play in our wider society. This issue was the central focus of the Association’s 2000 conference, and a recent issue of the journal New Zealand Sociology dedicated a symposium to addressing it. The relevant point, for the argument presented here, was that a number of symposium and conference participants were clear that sociology, first and foremost, needs to demonstrate its utility for understanding real-world questions. To put this another way, the discipline needs to restore the link between systematic inquiry and social purpose (Mills 1959).

So that is the future.

The encouraging news is that a number of these developments have also been suggested elsewhere, or are already in existence but need expanding (Social Science Reference Group 2001). But it will take time for some of these current bottlenecks to be worked through. Returning to an earlier point, a key role for private and tertiary contractors is to contribute to research capacity building within central government. Sometimes undertaking research, sometimes advising, but always attempting to leave a research culture behind, as well as recognising the constraints of the policy game.


Final Considerations

Some of the developments we hope for may be encouraged by broader, parallel developments. These include attempts to secure strategic policy-relevant research, which may perhaps increase recognition of sites of expertise (rather than individuals). Such discussion alludes, for example, to areas where there is a dearth of research activity, including cross-sectional activity (Minister for Research, Science and Technology 2001, Ministry of Social Policy 2001). Capacity building, therefore, is premised on continued dialogue concerning both supply and demand.

In the shorter term, an annual conference highlighting good applied work and method, where academics, researchers, postgraduates and policy-makers can mix, would be beneficial for all concerned. Another step is an evaluation qualification collectively funded, focused and delivered by the policy, tertiary and iwi communities – one that goes beyond attempts to graft it onto existing research courses. Finally, although the current policy climate for social science is warmer than in some seasons, a sudden drop in temperature may frustrate current hopes. Ensuring that high-quality policy advice has a part to play in the policy process is the responsibility of a number of institutions, and ongoing vigilance must be maintained.


References

Allen, R., J. Packer, J.C. Eccles, H.N. Parton, H.G. Forder and K.R. Popper (1945) Research and the University: A Statement by a Group of Teachers in the University of New Zealand, Caxton Press, Christchurch.

Bulmer, M. (1987a) “Social science in an age of uncertainty” in M. Bulmer (ed.) Social Science Research and Government: Comparative Essays on Britain and the United States, Cambridge University Press.

Bulmer, M. (1987b) “Varieties of methodology: strengthening the contribution of social science” in M. Bulmer (ed.) Social Science Research and Government: Comparative Essays on Britain and the United States, Cambridge University Press.

Business Information in Action (1999) “Improving public sector policy through quality evaluation” Business Information in Action Conference, Improving Public Sector Policy through Quality Evaluation, Wellington, 26-27 May.

Colmar Brunton (1997) Survey of Labour Market Adjustment under the Employment Contracts Act, Colmar Brunton, Auckland.

Curtis, B., D. Hoey and S. Matthewman (2001) “Different but the same: business ethics and university ethics, an(other) perspective” in M. Tolich (ed.) Research Ethics in Aotearoa New Zealand, Pearson Education, Auckland.

Davidson, C. (1999) “Selling sociology: better brands or bargain bins?” New Zealand Sociology, 14(2):223-240.

Davidson, C. and P. Voss (2002) Knowledge Management: An Introduction to Creating Competitive Advantage from Intellectual Capital, Tandem Press, Auckland.

Davidson, C. and N. Lunt (2001) “Where did all the good researchers go?” presentation to Auckland Evaluation Group, 4 December, Auckland.

Davies, H., S. Nutley and P. Smith (eds.) (2000) What Works?: Evidence-Based Policy and Practice in Public Services, Policy Press, Bristol.

Easton, B. (1997) The Commercialisation of New Zealand, Auckland University Press.

Endres, A.M. (1987) “Economics thought and policy advice in New Zealand 1927-1935: accommodating a tradition of policy activism” Working Paper in Economics, No. 32, Department of Economics, University of Auckland.

Endres, A.M. and G. A. Fleming (1994) “Monetary thought and the analysis of price stability in early twentieth century New Zealand” Working Paper in Economics, No. 139, Department of Economics, University of Auckland.

Flather, P. (1987) “‘Pulling through’ – conspiracies, counterplots, and how the SSRC escaped the axe in 1982” in M. Bulmer (ed.) Social Science Research and Government: Comparative Essays on Britain and the United States, Cambridge University Press.

Forsyte (1998) Experience of the English Language Bond, Forsyte, Auckland.

Forsyte (1999) New Zealand Domestic Travel Study, Forsyte, Auckland.

Fougere, G. and J. Orbell (1975) “Proposal to establish a centre for the study of New Zealand society” Australian and New Zealand Journal of Sociology, 10(3):227-9.

Gibson, R.E. (1970) Report on Social Science Research Services, tabled to National Research Advisory Council, 8 December, Wellington.

Hawkins, P. (1999) “Moving towards quality evaluation” paper presented to the Business Information in Action Conference, Improving Public Sector Policy through Quality Evaluation, 26-27 May, Wellington.

Hawkins, P. (2000) “Is contracting good for evaluation?” paper presented to the Auckland Evaluation Group, 12 October, Auckland.

Holmes, F. (1981) Address to National Research Advisory Council Conference Social Science Research in New Zealand: The University, the Government and Other Agencies, Wellington.

Keir, M. (1982) Education and Training of Social Scientists in the Public Service, Co-Ordinating and Advisory Committee for Social Science Research in Government, Wellington.

Kerr, C. (1963) Uses of the University, Harvard University Press, Cambridge, Massachusetts.

Kogan, M. and M. Henkel (1983) Government and Research: The Rothschild Experiment in Government Departments, Heinemann Educational Books, London.

Kuhn, T. (1962) The Structure of Scientific Revolutions, University of Chicago Press.

Lindblom, C. (1979) “Still muddling through” Public Administration Review, 39(6):517-25.

Lunt, N. (1999) “The academic discipline of social policy” Social Policy Journal of New Zealand, 12:1-16.

Lunt, N., K. McKegg and C. Davidson (eds.) (in press) Evaluating Policy and Practice: A New Zealand Reader, Pearson Education, Auckland.

MacKay, I.J.D. (1975) “Origins of social research in government agencies” New Zealand Journal of Public Administration, 37(2):57-72.

Marsh, D.C. (1952) “Old people in the modern state” Political Science, March, pp.23-28.

McCreary, J. (1971) “School of Social Science Part One” New Zealand Social Worker, 7:1.

McKernon, S. (2001) “Designing an ethics code for qualitative market research” in M. Tolich (ed.) Research Ethics in Aotearoa New Zealand, Pearson Education, Auckland.

Miller, R.B. (1987) “Social science under siege: the political response 1981-1984” in M. Bulmer (ed.) Social Science Research and Government: Comparative Essays on Britain and the United States, Cambridge University Press.

Mills, C.W. (1959) The Sociological Imagination, Oxford University Press, New York.

Ministry of Social Policy (2001) Strategic Knowledge Needs in Social Policy, Wellington.

Minister for Research, Science and Technology (2001) “Science Policy: the social sciences and the humanities” address by Hon Pete Hodgson to the Stout Research Centre, Victoria University of Wellington.

Parton, H. (1979) University of New Zealand, Auckland University Press/Oxford University Press for the University Grants Committee, Auckland.

Polanyi, K. (1957) The Great Transformation, Beacon Press, Boston.

Robb, J.H. (1987) The Life and Death of Official Social Research in New Zealand 1936-40, Occasional Paper in Sociology and Social Work 7, Victoria University of Wellington.

Rothblatt, S. and B. Wittrock (1993) The European and American Universities since 1800: Historical and Sociological Essays, Cambridge University Press.

Rothschild, Lord (1971) “The organisation and management of government R. & D.” in the Green Paper A Framework for Government Research and Development (Cm 4814 November 1971) Her Majesty’s Stationery Office, London.

Schön, D.A. (1983) The Reflective Practitioner: How Professionals Think in Action, Basic Books, New York.

Shields, M. (1975) “Social research for a dynamic society” Australian and New Zealand Journal of Sociology, 10(3):230-2.

Smith, L. (1999) Decolonising Methodologies: Research and Indigenous Peoples, Otago University Press, Dunedin.

Smithies, R. and S. Bidrose (2000) “Debating a research agenda for children for the next five years” Social Policy Journal of New Zealand, Issue 15:41-54.

Social Science Reference Group (2001) Connections, Resources and Capacities: How Social Science Research Can Better Inform Social Policy Advice, Report from the Improving the Knowledge Base for Social Policy Reference Group, Wellington.

State Services Commission (1999) Looping the Loop: Improving the Quality of Policy Advice, State Services Commission, Wellington.

Timms, D.W.G. (1970) “The teaching of sociology in New Zealand” in J. Zubrzycki (ed.) Teaching of Sociology in Australia and New Zealand, Cheshire for the Sociological Association of Australia and New Zealand, Melbourne.

Tolich, M. (ed.) (2001) Research Ethics in Aotearoa New Zealand, Pearson Education, Auckland.

Tolich, M. and C. Davidson (1999) “Beyond Cartwright: observing ethics in small town New Zealand” New Zealand Sociology, 14(1):61-84.

Tolich, M. and C. Davidson (1999) Starting Fieldwork: An Introduction to Qualitative Research in New Zealand, Oxford University Press, Melbourne.

Weiss, C. (ed.) (1977) Using Social Research in Public Policy Making, Lexington Press, Massachusetts.

Weiss, C. (1998) Evaluation: Methods for Studying Programs and Policies, 2nd edition, Prentice Hall, Upper Saddle River, New Jersey.

White Paper (1972) Framework for Government Research and Development (Cm 5046 July 1972) Her Majesty’s Stationery Office, London.

