Testimonials from Past Interns
Michigan Language Assessment’s internship program provides professional training and research opportunities for English language teaching and assessment professionals and for graduate students. Read what former interns have to say about their experiences in our program.
For questions about the internship program, email research@michiganassessment.org.
Haeun (Hannah) Kim is a PhD student in the Applied Linguistics and Technology program with a minor in Statistics at Iowa State University. Before joining the program, she taught English for six years to middle and high school students in South Korea. Her research interests include language assessments for young learners, argument-based validation in testing, corpus linguistics, and language learning in the digital wilds. She is currently working on her dissertation, which focuses on the use of picture-based narrative writing tasks in WIDA’s ACCESS.
The summer internship at Michigan Language Assessment was a great opportunity for me to experience the day-to-day operations of a language testing organization and understand the collaborative effort that goes into developing and administering a standardized language test. In addition, I really enjoyed working on my research project, which gave me a chance to delve deeply into MET’s construct and domain relevance. It was rewarding to engage in a topic that aligns so closely with my research focus on validation in testing. My mentors, Mika and Tahnee, provided invaluable advice and support while I was working on the project. Other team members at Michigan Language Assessment were also very welcoming and were willing to share their knowledge and expertise with me whenever I came to them with questions.
My previous language assessment internship was fully remote at the height of the pandemic, so working in a hybrid setting at Michigan Language Assessment in Ann Arbor was a very new and exciting experience for me. I was impressed by the productive and collaborative environment MLA has created through a balance between in-person and digital interactions. On the days I worked remotely, I often found myself at the University of Michigan’s libraries. Their architectural beauty, combined with the vast collection of books and resources available to me as an MLA staff member, made them a perfect sanctuary for work and research.
During the weekends, I was able to visit beautiful places in Michigan people had recommended, such as Mackinac Island, the Sleeping Bear Dunes National Lakeshore, and Traverse City. Attending local events like the Ann Arbor Summer Festival and the Ann Arbor Art Fair made my summer in Michigan even more memorable. I know I will miss the short walks and conversations I had with team members on our way to Argus and Blank Slate. I look forward to meeting team members again at future language testing conferences and in other professional settings.
“I am so grateful that I’ve met many great people here. Everyone was willing to share knowledge about the company and was patient in answering questions.”
Huiying Cai is a PhD student in Linguistics at the University of Illinois Urbana-Champaign (UIUC). Her research interests lie mainly in L2 assessment, focusing on the interplay of language testing and computational linguistics. Her studies apply natural language processing techniques to test-relevant corpora and use quantitative research methods to provide empirical evidence for test validity arguments.
When I applied for the summer internship at Michigan Language Assessment, a well-known language testing company, I was excited about the opportunity to gain practical knowledge and experience with large-scale standardized language proficiency tests. That is indeed what I gained once I arrived. The experience truly broadened my horizons, as I had previously worked only on a local test in a university setting. Working in a hybrid arrangement after the pandemic was also a new and interesting experience.
I was assigned a research project to identify factors that affect item difficulty, with the goal of helping item writers and developers produce more difficult items. The project focused on discrete listening and grammar items on the MET. I first reviewed the literature to see which factors the field had already identified. Then I extracted a wide array of textual features from items at different difficulty levels using NLP tools and performed a statistical analysis to determine which features tend to affect (i.e., increase or decrease) item difficulty and which contribute more than others. I enjoyed the whole process and the chance to use my skill set to solve real-world problems. Hearing from the team that my findings might be useful made my day.
In addition, I was invited to meetings with different teams, both online and in person, where I had the opportunity to learn about the overall structure of a testing company, the primary responsibilities of each team, and how those teams cooperate. This gave me a basic understanding of areas beyond assessment, such as marketing, operations, and business management. It was particularly intriguing to hear people discuss the same topic from different perspectives.
I am so grateful that I’ve met many great people here. Everyone was willing to share knowledge about the company and was patient in answering questions. Patrick and Jay gave me a great deal of support on my research project throughout the whole process and encouraged me to ask for help. Tahnee was always there, checking in on what I might be interested in and on how I felt about working in a hybrid arrangement. She also gave me helpful suggestions on my project and my future career. It was a wonderful experience working here.
During the weekends, I immersed myself in exploring the town. Ann Arbor is a lovely place full of excellent restaurants, cute markets and stores, and beautiful gardens. I also fell in love with its diverse culture and cheerful atmosphere.
“My mentors Sharon and Patrick were extremely patient and supportive. They were always available to answer my questions and direct me to the right resources. I would highly recommend this internship to anyone who is looking for short-term working experience and/or long-term research collaboration.”
Yiran Xu is a PhD candidate in Applied Linguistics at Georgetown University. She received her MA in Educational Studies from the University of Michigan – Ann Arbor and BA in English Language and Literature from Qingdao University in China. She conducts research in the fields of second language acquisition (SLA) and language testing with a primary focus on the development and assessment of L2 writing. She has been working for the Assessment and Evaluation Language Resource Center (AELRC) at Georgetown University since 2016, where she helped plan several language testing conferences and collaboratively developed a Mandarin C-test for research purposes. She is currently working on her dissertation “L2 Writing Complexity in Academic Legal Discourse: Development and Assessment under a Curricular Lens.”
Michigan Language Assessment (MLA) is a wonderful place for a summer internship, as it allowed me to get a taste of large-scale international language assessment and to apply my knowledge of SLA and language assessment to actual test development. It has been a truly enjoyable and rewarding experience.
During my internship, I was given two assignments that required me to work both independently and collaboratively. The first was to prepare a literature review summarizing and synthesizing scholarship on training standard-setting panelists and to propose a new standard-setting study. The second was to work collaboratively with a Writing Revision Team to develop a new rating scale for ECPE writing and to plan the pilot study and data analysis. Both assignments allowed me to engage actively with the most current research and to explore the intersections between theory and practice in a large-scale assessment setting. While working on my assignments, I learned how to use FACETS to conduct many-facet Rasch analysis with the help of Gad and Patrick. I also regularly attended various test development meetings, where I gained numerous practical insights from colleagues with different backgrounds and years of test development experience.
The MLA community is very welcoming and efficient. Thanks to their well-planned orientation program, I was able to adjust quickly to the new working environment without feeling overwhelmed. My mentors Sharon and Patrick were extremely patient and supportive. They were always available to answer my questions and direct me to the right resources. I would highly recommend this internship to anyone who is looking for short-term working experience and/or long-term research collaboration.
During my free time, I enjoyed chatting over lunch breaks and taking walks in the neighborhood with my colleagues. It was also very nice to live in Ann Arbor again – I met up with some old friends and went to the Summer Festival and the Art Fair, which I hadn’t had a chance to explore in the past.
“I am especially thankful to Gad for his guidance and support throughout the project. His extensive knowledge, expertise, and experience in language testing have all inspired me profoundly.”
Xiaowan Zhang received her MA in TESOL from the University of Illinois at Urbana-Champaign and her BA in English from Wuhan University, China. Xiaowan is currently a PhD candidate in the Second Language Studies Program at Michigan State University, where she teaches introductory courses to TESOL minors. Her main research interests include language testing, language policy, and quantitative research methods.
When I received the internship offer from Michigan Language Assessment, I imagined with great excitement what it would be like to work in a well-known testing company. I dreamed of myself working on interesting projects, learning from testing professionals, and participating in the day-to-day business operations. Not only did all my dreams come true at Michigan Language Assessment, but I was also given much more.
My main task here was to develop a toolkit for investigating the impact of the Michigan English Test (MET) and its sister test for young learners, the MET Go!. This project provided me with an opportunity to further develop my research interest in test impact. It required me to dig deep into the literature on test washback and impact, to understand the constructs of the MET and MET Go!, and to create instruments appropriate for the specific context of South America, where the MET and MET Go! are currently being used. With the help of my mentor, Dr. Gad Lim, I also had the opportunity to pilot some of the survey questions with test takers in Costa Rica and Colombia. I am especially thankful to Gad for his guidance and support throughout the project. His extensive knowledge, expertise, and experience in language testing have all inspired me profoundly.
In addition to the impact project, I was involved in another research project aimed at investigating the cognitive diagnostic validity of the two-skill MET (listening and reading only). I worked with five assessment staff members to identify the skills necessary to answer the MET listening and reading items. I tagged two MET forms independently based on a skill list prepared by the team leader and discussed my tags with the other members. Together, we resolved the differences in our tags and refined the original skill list through group discussion. This project has greatly expanded my knowledge of the application of structural equation modeling in test validation and has prompted me to explore the use of cognitive diagnostic modeling in my own research.
Aside from research experience, I also gained insight into the nuts and bolts of test development. I participated in all assessment staff meetings and content review sessions. Through these meetings and sessions, I learned a great deal about item writing, item review, and item analysis. I also had the opportunity to interview three business managers in South America and learned how tests are promoted to test takers and test centers in that region.
This internship program is the best way I can think of to spend my summer. All the staff members were warm and friendly and were eager to help me with any questions and concerns. Additionally, they gave me so many interesting ideas for spending my weekends, which helped me make the most of my stay in Ann Arbor. Thanks to them, I enjoyed every bit of my internship. I will definitely miss the people here and Ann Arbor’s restaurants, peony garden, farmers market, civic concerts, summer festival, and art fair.
“My internship was only for eight weeks, but I feel I not only learned a lot about language assessment but also met great people. The friendly environment where all staff members are eager to help each other and learn new things was one of the best parts of the internship.”
Senyung Lee is a PhD candidate in second language studies at Indiana University, focusing on second language assessment. Her main research interests are testing L2 collocations and L2 writing assessment.
One of the main reasons I applied to intern at Michigan Language Assessment was to gain experience in quality assurance practices in a large-scale assessment context. Most of my previous hands-on experience in language testing had been writing test specifications, writing items, and developing rubrics, and I had not worked on post-development phases. I was excited when I learned that Michigan Language Assessment provides opportunities to work on quality management of existing tests.
I was tasked with laying out possible revisions to one of the tests, and I learned a great deal about stakeholder interests and practical concerns regarding a large-scale international English test. This was an invaluable experience because I learned to consider the big picture of test administration as a whole rather than focusing only on test constructs and individual items at the micro level. In addition, interns were invited to staff meetings, which allowed me to understand how staff members with different specialties work as a cohesive team.
Outside of work, I did my best to try different cuisines in Ann Arbor’s famous restaurants. Ann Arbor is such a vibrant city with great restaurants and coffee shops, and I will miss having a variety of options for food. I was also able to enjoy the Ann Arbor Summer Festival to the fullest.
My internship was only for eight weeks, but I feel I not only learned a lot about language assessment but also met great people. The friendly environment where all staff members are eager to help each other and learn new things was one of the best parts of the internship. I’d also like to thank Gad Lim and Rachel Basse for their support and the very accommodating supervision they provided. This was an amazing opportunity for me to develop as a language tester.
“Overall, this internship was a valuable experience for me to grow professionally. And this experience was perfected by the beautiful, multicultural Ann Arbor, which I totally fell in love with.”
Phuong Nguyen received her MA in Applied Linguistics from the University of Melbourne, Australia. She taught academic English and English for Specific Purposes courses to university-level students in Vietnam. She recently interned at the Center for Applied Linguistics with WIDA’s ACCESS 2.0 test development team and at Michigan Language Assessment, working on MET Go!. She is currently a PhD student in Applied Linguistics and Technology with a minor in Statistics at Iowa State University, where she also works as an assistant coordinator for the English Placement Test and as an instructor for an introductory linguistics course.
My career goal is to work for language testing agencies, developing language tests and investigating the extent to which those tests are well designed to elicit valid, reliable, and relevant information about examinees. I applied for an internship at Michigan Language Assessment because I knew it was a perfect place to experience firsthand how professionals collaborate in designing high-quality assessments and to broaden my perspective as a language tester and researcher.
During my time as an intern, I was involved in many projects related to the new MET Go! test. Most of my time was spent researching the use of checklists as a rating tool, examining the functionality of the rating tools for the speaking and writing tests, and helping test developers revise the rating tools and draft test development reports. These projects not only allowed me to apply what I had learned in my graduate programs but also broadened my horizons. I learned many things from my mentors, Patrick and Gad, and other test developers, including knowledge of learning-oriented assessment and new R packages. I also appreciated being involved in various meetings with different test development teams, which helped me understand the enormous amount of time, organization, collaboration, and creativity needed before launching a new test. I also enjoyed the supportive working environment and the friendly staff at Michigan Language Assessment, and the fact that our collaboration will extend beyond the internship.