Testimonials from Past Interns
Michigan Language Assessment’s internship program provides professional training and research opportunities for English language teaching and assessment professionals and for graduate students. Read what former interns have to say about their experiences in our program.
For questions about the internship program, email email@example.com.
“My mentors Sharon and Patrick were extremely patient and supportive. They were always available to answer my questions and direct me to the right resources. I would highly recommend this internship to anyone who is looking for short-term working experience and/or long-term research collaboration.”
Yiran Xu is a PhD candidate in Applied Linguistics at Georgetown University. She received her MA in Educational Studies from the University of Michigan – Ann Arbor and BA in English Language and Literature from Qingdao University in China. She conducts research in the fields of second language acquisition (SLA) and language testing with a primary focus on the development and assessment of L2 writing. She has been working for the Assessment and Evaluation Language Resource Center (AELRC) at Georgetown University since 2016, where she helped plan several language testing conferences and collaboratively developed a Mandarin C-test for research purposes. She is currently working on her dissertation “L2 Writing Complexity in Academic Legal Discourse: Development and Assessment under a Curricular Lens.”
Michigan Language Assessment (MLA) is a wonderful place for a summer internship, as it allowed me to get a taste of large-scale international language assessment and apply my knowledge of SLA and language assessment to actual test development. It has been a truly enjoyable and rewarding experience.
During my internship, I was tasked with two assignments that required me to work both independently and collaboratively. The first was to prepare a literature review summarizing and synthesizing scholarship on training standard-setting panelists and to propose a new standard-setting study. The second was to work collaboratively with a Writing Revision Team to develop a new rating scale for ECPE writing and to plan the pilot study and data analysis. Both assignments allowed me to engage actively with the most current research and to explore the intersections between theory and practice in a large-scale assessment setting. While working on my assignments, I learned, with the help of Gad and Patrick, how to use FACETS to conduct many-facet Rasch analysis. I also regularly attended test development meetings, where I gained numerous practical insights from colleagues with different backgrounds and varying years of test development experience.
The MLA community is very welcoming and efficient. Thanks to their well-planned orientation program, I was able to quickly adjust myself to the new working environment without feeling overwhelmed. My mentors Sharon and Patrick were extremely patient and supportive. They were always available to answer my questions and direct me to the right resources. I would highly recommend this internship to anyone who is looking for short-term working experience and/or long-term research collaboration.
During my free time, I enjoyed chatting over lunch breaks and taking a walk in the neighborhood with my colleagues. It was also very nice to live in Ann Arbor again – I met with some old friends and went to the Summer Festival and the Art Fair which I didn’t have a chance to explore in the past.
“I am especially thankful to Gad for his guidance and support throughout the project. His extensive knowledge, expertise, and experience in language testing have all inspired me profoundly.”
Xiaowan Zhang received her MA in TESOL from the University of Illinois at Urbana-Champaign and her BA in English from Wuhan University, China. Xiaowan is currently a PhD candidate in the Second Language Studies Program at Michigan State University, where she teaches introductory courses to TESOL minors. Her main research interests include language testing, language policy, and quantitative research methods.
When I received the internship offer from Michigan Language Assessment, I imagined with great excitement what it would be like to work in a well-known testing company. I dreamed of myself working on interesting projects, learning from testing professionals, and participating in the day-to-day business operations. Not only did all my dreams come true at Michigan Language Assessment, but I was also given much more.
My main task here was to develop a toolkit for investigating the impact of the Michigan English Test (MET) and its sister test for young learners, the MET Go!. This project provided me with an opportunity to further develop my research interest in test impact. It required me to dig deep into the literature on test washback and impact, to understand the constructs of the MET and MET Go!, and to create instruments appropriate for the specific context of South America, where the MET and MET Go! are currently being used. With the help of my mentor, Dr. Gad Lim, I also had the opportunity to pilot some of the survey questions with test takers in Costa Rica and Colombia. I am especially thankful to Gad for his guidance and support throughout the project. His extensive knowledge, expertise, and experience in language testing have all inspired me profoundly.
In addition to the impact project, I was involved in another research project that was aimed at investigating the cognitive diagnostic validity of the two-skill MET (listening and reading only). I worked with five assessment staff members to identify the skills that are necessary to answer the MET listening and reading items. I tagged two MET forms independently based on a skill list prepared by the team leader and discussed my tags with other members. Altogether, we resolved the differences in our tags and refined the original skill list through group discussion. This project has greatly expanded my knowledge of the application of structural equation modeling in test validation and has prompted me to explore the use of cognitive diagnostic modeling in my own research.
Aside from research experience, I also gained insight into the nuts and bolts of test development. I participated in all assessment staff meetings and content review sessions. Through these meetings and sessions, I learned a great deal about item writing, item review, and item analysis. I also had the opportunity to interview three business managers in South America and learned how tests are promoted to test takers and test centers in that region.
This internship program is the best way I can think of to spend my summer. All the staff members were warm and friendly and were eager to help me with any questions and concerns. Additionally, they gave me so many interesting ideas for spending my weekends, which enabled me to make the most of my stay in Ann Arbor. Thanks to them, I enjoyed every bit of my internship. I will definitely miss the people here and Ann Arbor’s restaurants, peony garden, farmers market, civic concerts, summer festival, and art fair.
“My internship was only for eight weeks, but I feel I not only learned a lot about language assessment but also met great people. The friendly environment where all staff members are eager to help each other and learn new things was one of the best parts of the internship.”
Senyung Lee is a PhD candidate in second language studies at Indiana University, focusing on second language assessment. Her main research interests are testing L2 collocations and L2 writing assessment.
One of the main reasons I applied to intern at Michigan Language Assessment was to gain experience in quality assurance practices in a large-scale assessment context. Most of my previous hands-on experience in language testing had been in writing test specifications, writing items, and developing rubrics; I had not worked on post-development phases. I was excited when I learned that Michigan Language Assessment provides opportunities to work on quality management of existing tests.
I was tasked with laying out possible revisions to one of the tests, and I learned a great deal about stakeholder interests and practical concerns surrounding a large-scale international English test. This was an invaluable experience for me because I learned to consider the big picture of the whole test administration rather than focusing only on test constructs and individual items at the micro level. In addition, interns were invited to staff meetings, which allowed me to understand how staff members with different specialties work as a cohesive team.
Outside of work, I did my best to try different cuisines in Ann Arbor’s famous restaurants. Ann Arbor is such a vibrant city with great restaurants and coffee shops, and I will miss having a variety of options for food. I was also able to enjoy the Ann Arbor Summer Festival to the fullest.
My internship was only eight weeks long, but I feel I not only learned a lot about language assessment but also met great people. The friendly environment, where all staff members are eager to help each other and learn new things, was one of the best parts of the internship. I’d also like to thank Gad Lim and Rachel Basse for their support and the very accommodating supervision they provided. This was an amazing opportunity for me to develop as a language tester.
“Overall, this internship was a valuable experience for me to grow professionally. And this experience was perfected by the beautiful, multicultural Ann Arbor, which I totally fell in love with.”
Phuong Nguyen received her MA in Applied Linguistics from the University of Melbourne, Australia. She taught academic English and English-for-Specific-Purposes courses to university-level students in Vietnam. Recently, she interned at the Center for Applied Linguistics with WIDA’s ACCESS 2.0 test development team and at Michigan Language Assessment, working on MET Go!. She is currently a PhD student in Applied Linguistics and Technology with a minor in statistics at Iowa State University, where she also works as an assistant coordinator for the English Placement Test and as an instructor for an introductory linguistics course.
My career goal is to work for language testing agencies, developing language tests and investigating the extent to which they are well designed to elicit valid, reliable, and relevant information about examinees. I applied for an internship at Michigan Language Assessment because I knew it would be a perfect place to experience firsthand how professionals collaborate in designing high-quality assessments and to broaden my perspective as a language tester and researcher.
During my time as an intern, I was involved in many projects related to the new MET Go! test. Most of my time was spent researching the use of checklists as a rating tool, examining the functionality of the rating tools for the speaking and writing tests, and helping test developers revise the rating tools and draft test development reports. These projects not only allowed me to apply what I had learned in my graduate programs but also broadened my horizons. I learned many things from my mentors, Patrick and Gad, and other test developers, including knowledge about learning-oriented assessment and new R packages. I also appreciated being involved in various meetings with different test development teams, which helped me understand the enormous amount of time, organization, collaboration, and creativity needed to launch a new test. Finally, I enjoyed the supportive working environment and the friendly staff at Michigan Language Assessment, and the fact that our collaboration will extend beyond the internship.