Neal Martin Kingston, Ph.D.

School of Education - Educational Psychology, Center for Educational Testing and Evaluation
University Distinguished Professor
Director, Achievement & Assessment Institute
Ph.D., Educational Measurement, Teachers College, Columbia University
Primary office:
785-864-9705
Joseph R. Pearson Hall
Room 605A
University of Kansas
1122 West Campus Rd
Lawrence, KS 66045-3101


Summary

Neal Kingston, Ph.D., came to the University of Kansas in 2006 and is a University Distinguished Professor in the Research, Evaluation, Measurement, and Statistics track of the Educational Psychology and Research Program and Director of the Achievement and Assessment Institute. His research focuses on large-scale assessment, with particular emphasis on how it can better support student learning through the use of learning maps and diagnostic classification models. He has been the principal investigator or co-principal investigator for over 200 research projects, including two of his current projects, Enhanced Learning Maps (https://enhancedlearningmaps.org/) and Project Lead The Way.

Dr. Kingston is known internationally for his work on large-scale assessment, formative assessment, and learning maps. He has served as a consultant or advisor for organizations such as AT&T, the Department of Defense Council on Military Personnel Testing, Edvantia, the General Equivalency Diploma (GED) program, Kaplan, King Fahd University of Petroleum and Minerals, Merrill Lynch, the National Council on Disability, Qeyas (the Saudi Arabian National Center for Assessment in Higher Education), the State of New Hampshire, the State of Utah, and the U.S. Department of Education.

A hallmark of Dr. Kingston's career has been a focus on research management. His first such responsibilities were as Director of Selection Research for the Los Angeles County Department of Personnel, where he and his staff were responsible for job analysis; construction and revision of testing material; review of the appropriateness of existing testing material for specific job classes; content, criterion-related, and construct validation; adverse impact analysis; setting cut scores; item and test bias studies; employee retention studies; and the development and use of new technology (microcomputers, minicomputers, intelligent terminals, scanners, etc.).

As Director of Research and Test Development for the Graduate Record Examinations (GRE) Program, Dr. Kingston was responsible for all administrative aspects of GRE research. He worked with the GRE Board and its Research Committee to develop and help execute long-term research plans. Among his numerous committee and task force memberships at ETS, he served on the Research Planning Committee on Technology and Testing and the Strategic Planning Committee on Alternative Test Delivery.

As Director of Research and New Testing Initiatives for Educational Testing Service's Graduate Record Examinations (GRE) Program, Dr. Kingston designed and administered an extensive research program that led to the development of models for future versions of the GRE, including computerized adaptive testing and the addition of a writing measure. He supported the GRE Board Research Committee, enabling it to efficiently guide GRE Program research and development and the proper use of GRE tests.

As Associate Commissioner, Office of Curriculum, Assessment, and Accountability, Kentucky Department of Education, Dr. Kingston developed a research agenda to support Kentucky's assessment and curriculum goals. He chaired a review group of nationally prominent researchers from within and outside the state.

At Measured Progress, Dr. Kingston served as Senior Vice President and Chief Operating Officer. During this time he systematized and named what is now known as the Body of Work standard-setting method and was among the first to implement cross-grade articulation of performance standards.

Since 2009, as Director of the Center for Educational Testing and Evaluation and now Director of the Achievement and Assessment Institute, Dr. Kingston has served as principal investigator or co-principal investigator for more than 120 grants totaling about $200 million. Of particular note was the Dynamic Learning Maps Alternate Assessment, the largest grant in KU history, which currently serves 19 state departments of education. Other testing projects include the Kansas Assessment Program, the Career Pathways Collaborative, and Adaptive Reading Motivation Measures. In his capacity as Director of the Achievement and Assessment Institute, Dr. Kingston is responsible for five research centers with about 260 year-round staff and about 140 temporary employees.

Education

M.Phil., Educational Measurement, Teachers College, Columbia University

Ph.D., Educational Measurement, Teachers College, Columbia University

M.Ed., Educational Measurement, Teachers College, Columbia University

M.A., Psychology in Education, Teachers College, Columbia University

B.A., Liberal Studies (concentrations in Biology & Education), State University of New York

Teaching

Starting in 2019, Dr. Kingston's teaching will focus on advanced test theory and practice using an integrated framework that covers classical and modern approaches.

Prospective Graduate Students:

I am seeking one or two new doctoral students (a previous master's degree is not required) to start in fall 2019. Students will receive a graduate research assistant position that covers their tuition and fees and provides a salary (https://aai.ku.edu/sites/aai.ku.edu/files/docs/pdfs_general/GRA_Handbook...). Only candidates with very strong verbal reasoning, quantitative reasoning, and communication skills will be considered. An interest in learning and applying test theory to better support student learning is of critical importance.

Teaching Interests

  • Educational Measurement theory and practice
  • Instructionally embedded assessment
  • Learning maps
  • Meta-analysis

Research

The field of education moves slowly. Theoretical improvements often take 30-50 years to achieve widespread implementation, and research-based practices are too often crowded out by the fad of the day. Making the challenge of improving education even greater, sub-disciplines within education far too often work in isolation. Simple solutions that do not address the complexity of individual students or the dynamics of a classroom have, at best, little impact and too often a negative one. Such has been the case in large-scale assessment, where the use of assessment to drive curriculum and instruction has had numerous negative consequences.

Coming to the University of Kansas gave me the opportunity to consider the fundamental issues of education from a broader perspective. Like many others, I had long realized that thinking about curriculum, instruction, and assessment needs to be integrated. However, few researchers had attempted to develop models or theories to do this. I was impressed by the efforts of some, particularly the research trajectories of Susan Embretson and Kikumi Tatsuoka, but I remained frustrated by how incomplete this work was and how little impact it was having on federally mandated state assessment programs. This led me to develop three conference presentations in 2009 that served to focus my thinking. The first, presented at the National Council on Measurement in Education annual meeting, was entitled "What Have We Learned about the Structure of Learning from 30 Years of Research on Integrated Cognitive-Psychometric Models? Not Much." The second, presented at the American Educational Research Association conference, was entitled "The Efficacy of Formative Assessment: A Meta-Analysis." The third, presented at the National Conference on Student Assessment, was entitled "Large-Scale Formative Assessment: Panacea, Transitional Tool, or Oxymoron."

In 2010 an opportunity presented itself that allowed me to solidify my thinking. The U.S. Department of Education issued a request for proposals to develop a large-scale assessment system for students with significant cognitive disabilities – the approximately one percent of students with the greatest learning challenges. It was clear to me that such an assessment system needed to do far more than measure learning – it needed to facilitate learning. I identified six features that needed to be present to do this. They are as follows.

1. Comprehensive fine-grained learning maps that guide instruction and assessment

2. A subset of particularly important nodes that serve as content standards to provide an organizational structure for teachers

3. Instructionally embedded assessments that reinforce the primacy of instruction

4. Instructionally relevant testlets that model good instruction and reinforce learning

5. Accessibility by design

6. Status and growth reporting that is readily actionable

No one had ever tried to develop a learning environment in this way. Comprehensive fine-grained learning maps did not exist. The concept of instructionally relevant assessment previously was unnamed and in its infancy. Clearly much research – both basic and applied – was necessary and this has become the focus of my research.
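To make the idea of a fine-grained learning map concrete: such a map can be modeled as a directed acyclic graph whose nodes are discrete skills and whose edges run from prerequisite skills to the skills they support. The sketch below is purely illustrative – the node names are hypothetical and are not drawn from the Dynamic Learning Maps project – but it shows how this structure can answer an instructionally useful question: which skills precede a given target skill.

```python
from collections import deque

# Hypothetical learning map: each key is a skill, and its value lists the
# skills it directly supports (edges point prerequisite -> supported skill).
LEARNING_MAP = {
    "count_objects": ["compare_quantities"],
    "subitize": ["compare_quantities"],
    "compare_quantities": ["add_within_10"],
    "add_within_10": ["add_within_100"],
}

def prerequisites(target):
    """Return the set of all skills that precede `target` in the map."""
    # Build the reverse adjacency list: skill -> its direct prerequisites.
    parents = {}
    for pre, supported in LEARNING_MAP.items():
        for skill in supported:
            parents.setdefault(skill, []).append(pre)
    # Breadth-first traversal backward through the map.
    seen = set()
    queue = deque(parents.get(target, []))
    while queue:
        node = queue.popleft()
        if node not in seen:
            seen.add(node)
            queue.extend(parents.get(node, []))
    return seen

print(sorted(prerequisites("add_within_10")))
# ['compare_quantities', 'count_objects', 'subitize']
```

In an operational system the map would contain thousands of nodes and would drive both instructional sequencing and the selection of instructionally embedded testlets; this toy traversal only conveys the underlying data structure.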

A related secondary research focus has remained important to me and is worthy of note: test development and universal design, which have close ties to features 3-5 in the list above. I treat this as a separate focus because it is also applicable to traditional testing programs.

Research Interests

  • Large-scale assessment
  • Computer-based testing
  • Diagnostic classification modeling
  • Learning maps
  • Test development
  • Score reporting

Selected Publications

Wang, W. & Kingston, N. M. (in press). Adaptive Testing with the Hierarchical Item Response Theory Model. Applied Psychological Measurement.

Embretson, S. E., & Kingston, N. M. (2018). Automatic Item Generation: A more Efficient Process for Developing Mathematics Achievement Items. Journal of Educational Measurement, 55(1), 112-131.

Atalmis, E. & Kingston, N. (2018). The Impact of Homogeneity of Answer Choices on Item Difficulty and Discrimination. SAGE Open, (January-March 2018), 1-9.

Karvonen, M. Wakeman, S. & Kingston, N. M. (2017). Alternate Assessment. In M. Wehmeyer & K. Shogren (Eds.), Handbook of Research-Based Practices for Educating Students with Intellectual Disability. (pp. 102-118). New York: Routledge.

Clark, A. Nash, B. Karvonen, M. & Kingston, N. (2017). Condensed Mastery Profile Method for Setting Standards for Diagnostic Assessment Systems. Educational Measurement: Issues and Practice, 36(4), 5-15.

Kingston, N. Karvonen, M. Thompson, J. Wehmeyer, M. & Shogren, K. (2017). Fostering Inclusion of Students with Significant Cognitive Disabilities by Using Learning Map Models and Map-Based Assessments. Inclusion, 5(2), 110-120.

Bender, A., Hagan, K., & Kingston, N. (2017). The Association of Folate Levels and Depression: A Meta-Analysis. Journal of Psychiatric Research, 95, 9-18.

Kingston, N. & Broaddus, A. (2017). The Use of Learning Map Systems to Support Formative Assessment in Mathematics. Education Sciences, 7(1), 41. DOI:10.3390/educsci7010041

Atalmis, E. & Kingston, N. (2017). Three, four, and none of the above options in multiple-choice items. Turkish Journal of Education, 6(4), 143-157. DOI:10.19128/turje.333687

Kingston, N. Karvonen, M. Bechard, S. & Erickson, K. (2016). The Philosophical Underpinnings and Key Features of the Dynamic Learning Maps Alternate Assessment. Teachers College Record, 118(14).

Cho, H. J., & Kingston, N. M. (2015). Examining Teachers' Decisions on Test-Type Assignment for Statewide Assessments. Journal of Special Education, 49(1), 16-27.

Kingston, N. (2015). Shifting the Paradigm of Large-Scale Achievement Assessment. (M. Rice & E. Haaheim). Merrill Advanced Studies Center White Paper No. 119: Research Innovation as a Pathway to the Future. Lawrence, KS: University of Kansas Merrill Advanced Studies Center.

Kingston, N. M., Broaddus, A., & Lao, H. (2015). Some Thoughts on "Using Learning Progressions to Design Vertical Scales That Support Coherent Inferences About Student Growth." Measurement: Interdisciplinary Research & Perspectives, 13(3-4), 195-199.

Clark, A. K., & Kingston, N. M. (2014). A Brief History of Research on Test Fraud Detection and Prevention. In N. Martin Kingston & A. K. Clark (Eds.), Test Fraud: Statistical Detection and Methodology. New York: Routledge.

Tiemann, G. C., & Kingston, N. M. (2014). An Exploration of Answer Changing Behavior on a Computer-Based High Stakes Achievement Test. In N. Martin Kingston & A. K. Clark (Eds.), Test Fraud: Statistical Detection and Methodology. New York: Routledge.

Kingston, N. M., & Clark, A. K. (2014). Introduction. In N. Martin Kingston & A. K. Clark (Eds.), Test Fraud: Statistical Detection and Methodology. New York: Routledge.

Kingston, N. M., & Clark, A. K. (2014). Test Fraud: Statistical Detection and Methodology, New York: Routledge.

Adjei, S., Selent, D., Heffernan, N., Pardos, Z., Broaddus, A., & Kingston, N. M. (2014). Refining Learning Maps with Data Fitting Techniques: Searching for Better Fitting Learning Maps. In Pardos & Stamper (Eds.), The 2014 Proceedings of the International Educational Data Mining Society.

Popham, W. J., Berliner, D. C., Kingston, N. M., Fuhrman, S. H., Ladd, S. M., Charbonneau, J. & Chatterji, M. (2014). Can today's standardized tests yield instructionally useful data? Challenges, promises, and the state of the art. Quality Assurance in Education, 22(4), 300-316.

Ginsberg, R. & Kingston, N. M. (2014). Caught in a Vise: The Challenges Facing Teacher Preparation in an Era of Accountability. Teachers College Record, 116(1), 2014. http://www.tcrecord.org

Cho, H. J., & Kingston, N. M. (2014). Understanding Test-Type Assignment: Why Do Special Educators Make Unexpected Test-Type Assignments? Psychology in the Schools, 51(8), 866-878. DOI:10.1002/PITS.21783

Kingston, N. M. (2014). The value of standard setting: A personal meandering. NCME Newsletter, 22(4), 7.

Kingston, N. M. (2013). Educational Testing Case Studies. In J. Wollack & J. Fremer (Eds.), Handbook of Test Security. New York: Routledge.

Kingston, N. M., & Kramer, L. B. (2013). High Stakes Test Construction and Test Use. In T. Little (Ed.), Oxford Handbook of Quantitative Methods. Oxford University Press.

Kingston, N. M., Scheuring, S. T., & Kramer, L. B. (2013). Test Development Strategies. In K. Geisinger (Ed.), APA Handbook of Testing and Assessment in Psychology. Washington, DC: APA Books.

Kingston, N. M., Tiemann, G. C., & Loughran, J. T. (2013). Commentary on "Construct Maps as a Foundation for Standard Setting." Measurement: Interdisciplinary Research & Perspectives, 11, 181-184.

Cho, H., Wehmeyer, M., & Kingston, N. M. (2013). Factors That Predict Elementary Educators' Perceptions and Practice in Teaching Self-Determination. Psychology in the Schools, 50, 770-780. DOI:10.1002/PITS.21707

Gu, F., Little, T., & Kingston, N. M. (2013). Misestimation of Reliability Using Coefficient Alpha and Structural Equation Modeling when Assumptions of Tau-Equivalence and Uncorrelated Errors are Violated. Methodology, 9, 30-40.

Kingston, N. M., & Anderson, G. (2013). The Efficacy of Using State Standards-Based Assessments for Predicting Student Success in Community College Classes. Educational Measurement: Issues and Practice, 32(3), 3-10.

Cho, H. & Kingston, N. M. (2013). Why IEP Teams Assign Low Performers With Mild Disabilities to the Alternate Assessment Based on Alternate Achievement Standards. Journal of Special Education, 47, 162-174.

Kingston, N. M., Tiemann, G. C., Miller, H. L., & Foster, D. (2012). An Analysis of the Discrete-Option Multiple Choice Item Type. Psychological Test and Assessment Modeling, 54, 3-20.

Cho, H. J., Lee, J. & Kingston, N. M. (2012). Examining the Effectiveness of Test Accommodation Using DIF and a Mixture IRT Model. Applied Measurement in Education, 25(4), 281-304.

Kingston, N. M., & Nash, B. (2012). How Many Formative Assessment Angels can Dance on the Head of a Meta-Analytic Pin: 2. Educational Measurement: Issues and Practice, 31(4), 18-19.

Cho, H. Wehmeyer, M. & Kingston, N. M. (2012). The Impact of Social and Classroom Ecological Factors on Promoting Self-Determination in Elementary School. Preventing School Failure, 56, 19-28.

Zheng, C. Erickson, A. G., Kingston, N. M., & Noonan, P. (2012). The Relationship among Self-Determination, Self-Concept, and Academic Achievement for Students with Learning Disabilities. Journal of Learning Disabilities, 46(2), 1-13.

Cho, H. J., & Kingston, N. M. (2012). Why Individualized Education Program Teams Assign Low-Performing Students with Mild Disabilities to the Alternate Assessment Alternate Achievement Standards. Journal of Special Education.

Kingston, N. M. (2012). It's 1938: A look back at the first year of NCME. NCME Newsletter, 20(3), 2-3.

Kingston, N. M. (2012). NCME Presidential Trivia. NCME Newsletter, 20(4), 6, 9.

Almond, P., Kingston, N. M., Michaels, H., Roeber, E., Warren, S., Winter, P., & Mark, C. (2012). Technical considerations for developing assessments that include special populations and are based on organized learning models. Menlo Park, CA, and Lawrence, KS: SRI International and Center for Educational Testing and Evaluation.

Kingston, N. M., & Tiemann, G. C. (2011). Setting Performance Standards on Complex Assessments: The Body of Work Method. In G. J. Cizek (Ed.), Setting performance standards: Concepts, methods, and perspectives (2nd ed.). Mahwah, NJ: Lawrence Erlbaum.

Cho, H. & Kingston, N. M. (2011). Capturing Implicit Policy from NCLB Test Type Assignments: Analysis of Characteristics and Performance of Students Taking Alternate Assessments Based on Modified Achievement Standards. Exceptional Children, 78, 58-72.

Cho, H. Wehmeyer, M. & Kingston, N. M. (2011). Elementary Teachers’ Knowledge and Use of Interventions and Barriers to Promoting Student Self-Determination. Journal of Special Education, 45, 149-156.

Kingston, N. M., & Nash, B. (2011). Formative Assessment: A Meta-Analysis and a Call for Research. Educational Measurement: Issues and Practice, 30(4), 28-37.

Gu, F. Skorupski, W. Hoyle, L. & Kingston, N. M. (2011). Standard Errors and Confidence Intervals from Bootstrapping for Item Parameters in the Two- and Three- Parameter Logistic Ramsey Curve-IRT Models. Applied Psychological Measurement, 35, 568-571.

Kingston, N. M. (2010). Computerized adaptive testing. In N. Salkind (Ed.), Encyclopedia of Research Design. Thousand Oaks, CA: Sage Publishing.

Chen, J. Kingston, N. M., Tiemann, G. & Gu, F. (2010). Hypothesis. In N. Salkind (Ed.), Encyclopedia of Research Design. Thousand Oaks, CA: Sage Publishing.

Tiemann, G. C., & Kingston, N. M. (2010). Non-Directional hypothesis testing. In N. Salkind (Ed.), Encyclopedia of Research Design. Thousand Oaks, CA: Sage Publishing.

Gu, F. & Kingston, N. M. (2010). Scatterplot. In N. Salkind (Ed.), Encyclopedia of Research Design. Thousand Oaks, CA: Sage Publishing.

Kingston, N. M. (2010). Sequential Tests of Statistical Hypotheses (Wald) [The Annals of Mathematical Statistics, 16, 2, 117]. In N. Salkind (Ed.), Encyclopedia of Research Design. Thousand Oaks, CA: Sage Publishing.

Kingston, N. M., & Tiemann, G. C. (2010). Spearman Brown prophecy formula. In N. Salkind (Ed.), Encyclopedia of Research Design. Thousand Oaks, CA: Sage Publishing.

Bechard, S. Sheinker, J. Abell, R. Barton, K. Burling, K. Camacho, C. Cameto, R. Haertel, G. Hansen, E. Johnstone, C. Kingston, N. M., Murray, E. Parker, C. Redfield, D. Rodriquez, J. & Tucker, B. (2010). Measuring Cognition of Students with Disabilities Using Technology-Enabled Assessments: Recommendations for a National Research Agenda. The Journal of Technology, Learning, and Assessment, 10(4).

Bechard, S. Sheinker, J. Abell, R. Barton, K. Blackorby, J. Burling, K. Camacho, C. Cameto, R. Haertel, G. Hansen, E. Johnstone, C. Kingston, N. Murray, E. Parker, C. Redfield, D. Rodriquez, J. & Tucker, B. (2010). Measuring Cognition of Students with Disabilities Using Technology-Enabled Assessments: Recommendations for a Research Agenda. Dover, NH: Measured Progress, and Menlo Park, CA: SRI International.

Kingston, N. Chin, T. Geisinger, K. & McKinley, R. (2010). Final Report, Third Party Equating Verification of the NYS Grades 3-8 ELA and Mathematics Tests. Buros Institute for Assessment Consultation and Outreach.

Kingston, N. M. (2009). ACT College Admissions Test. In B. Kerr (Ed.), Encyclopedia of Giftedness, Creativity, and Talent. Thousand Oaks, CA: Sage Publishing.

Kingston, N. M. (2009). SAT. In B. Kerr (Ed.), Encyclopedia of Giftedness, Creativity, and Talent. Thousand Oaks, CA: Sage Publishing.

Kingston, N. M. (2009). Self-report instruments. In S. Lopez (Ed.), Encyclopedia of Positive Psychology. Malden, MA: Blackwell Publishing.

Kingston, N. M. (2009). Comparability of computer- and paper-administered multiple-choice tests for K-12 populations: a synthesis. Applied Measurement in Education, 22, 22-37.

Peyton, V. Kingston, N. M., Skorupski, W. Glasnapp, D. R., & Poggio, J. P. (2009). Kansas English Language Proficiency Assessment (KELPA) Technical Manual.

Irwin, P. M., Kingston, N. M., Skorupski, W. P., Glasnapp, D. R., & Poggio, J. P. (2009). Technical Manual for the Kansas Assessments in Science.

Kingston, N. M. (2008). Norm-Referenced tests. In N. Salkind (Ed.), Encyclopedia of Educational Psychology. Thousand Oaks, CA: Sage Publishing.

Kingston, N. M. (2008). Standardized tests. In N. Salkind (Ed.), Encyclopedia of Educational Psychology. Thousand Oaks, CA: Sage Publishing.

Irwin, P. M., Kingston, N. M., Glasnapp, D. R., & Poggio, J. P. (2008). Technical Manual for the Kansas Assessments of Modified Measures (KAMM) and Kansas Alternate Assessments (KAA) in Science.

Kingston, N. M. (2007). Future challenges to psychometrics: Validity, Validity, Validity. In C. R. Rao & S. Sinharay (Eds.), Handbook of Statistics, 26: Psychometrics. Amsterdam: The Netherlands: Elsevier.

Kingston, N. M., & Ehringhaus, M. (2005). Use of technology and principles of universal design to improve the validity and fairness of licensure tests. In J. L. Mounty & D. S. Martin (Eds.), Assessing Deaf Adults. Washington, DC: Gallaudet University Press.

Dings, J. Childs, R. & Kingston, N. M. (2002). Effects of matrix sampling on student score comparability in constructed response and multiple-choice assessments. Washington, DC: Council of Chief State School Officers.

Kingston, N. M., Kahl, S. R., Sweeney, K. P., & Bay, L. (2001). Setting performance standards using the body of work method. In G. J. Cizek (Ed.), Setting performance standards: Concepts, methods, and perspectives. Mahwah, NJ: Lawrence Erlbaum, Publishers.

Kingston, N. M., & Reidy, E. (1997). Kentucky's accountability and assessment systems. In J. Millman (Ed.), Grading teachers, grading schools: Is student achievement a valid evaluation measure? Thousand Oaks, CA: Corwin Publishers.

Kingston, N. M., & Reidy, E. (1997). KIRIS meets the critics: A little light and much heat. In J. Millman (Ed.), Grading teachers, grading schools: Is student achievement a valid evaluation measure? Thousand Oaks, CA: Corwin Publishers.

Dings, J. Gong, B. & Kingston, N. M. (1995). KIRIS Accountability Cycle I Technical Manual. Frankfort, KY: Kentucky State Department of Education.

Kingston, N. M. (1993). Standard setting in compensatory versus non-compensatory licensure testing programs. CLEAR Exam Review, 4, 24-27.

Kingston, N. M., Leary, L. & Wightman, L. (1988). An exploratory study of the applicability of item response theory methods to the Graduate Management Admissions Test. Los Angeles: Graduate Management Admissions Council.

Schaeffer, G. A., & Kingston, N. M. (1988). RR 88-5. Strength of the analytical factor of the GRE General Test in several subgroups: a full information factor analysis approach. Princeton, NJ: Educational Testing Service.

McKinley, R. L., & Kingston, N. M. (1987). RR 87-21. Exploring the use of IRT equating for the GRE Subject Test in Mathematics. Princeton, NJ: Educational Testing Service.

Kingston, N. M., & Holland, P. W. (1986). GRE Board Professional Report 81-16P. Alternative methods of equating the GRE General Test. Princeton, NJ: Educational Testing Service.

Kingston, N. M. (1986). RR 86-13. Assessing the dimensionality of the GMAT Verbal and Quantitative Measures using full-information factor analysis. Princeton, NJ: Educational Testing Service.

Kingston, N. M., & Dorans, N. J. (1985). The analysis of item-ability regressions: An exploratory IRT model fit tool. Applied Psychological Measurement, 9, 281-288.

Dorans, N. J., & Kingston, N. M. (1985). The effects of violations of uni-dimensionality on the estimation of item and ability parameters and on item response theory equating of the GRE verbal scale. Journal of Educational Measurement, 22, 249-262.

Kingston, N. M., Leary, L. F., & Wightman, L. E. (1985). RR 85-34. An exploratory study of the applicability of item response theory methods to the Graduate Management Admission Test. Princeton, NJ: Educational Testing Service.

Kingston, N. M., & Dorans, N. J. (1984). Item location effects and their implications for IRT equating and adaptive testing. Applied Psychological Measurement, 8, 147-154.

Kingston, N. M., & Turner, N. J. (1984). GRE Board Professional Report 83-5P. Analysis of score change patterns of examinees repeating the Graduate Record Examinations General Test. Princeton, NJ: Educational Testing Service.

Kingston, N. M., & Dorans, N. J. (1982). GRE Board Professional Report 79-12bP. The effect of the position of an item within a test on item responding behavior: an analysis based on item response theory. Princeton, NJ: Educational Testing Service.

Kingston, N. M., & Dorans, N. J. (1982). GRE Board Professional Report 79-12P. The feasibility of using item response theory as a psychometric model for the GRE Aptitude Test. Princeton, NJ: Educational Testing Service.

Kingston, N. M., & Livingston, S. A. (1981). Effectiveness of the Graduate Record Examinations for predicting first year grades: 1979-80 summary report of the Graduate Record Examinations Validity Study Service. Princeton, NJ: Educational Testing Service.

