Testimonials from Past Interns

Michigan Language Assessment’s internship program provides professional training and research opportunities for graduate students. Read what former interns have to say about their experiences in our program.

For questions about the internship program, email research@michiganassessment.org.

Winner of 2024 Best Student Paper Award from the Midwest Association of Language Testers

From June to August 2024, Burak Senel from Iowa State University completed a rigorous three-month internship at Michigan Language Assessment, culminating in an extensive 85-page white paper on automated scoring system validation. His work on the automated scoring of writing not only earned him the Best Student Paper Award from the Midwest Association of Language Testers (MwALT) but also marked a notable contribution to the field of language assessment.

Read the full blog!


“Professionally, Michigan Language Assessment greatly contributed to my development. The environment encouraged deep engagement with my work and offered many opportunities to interact with assessment stakeholders who were always receptive to my ideas and available for questions and feedback.”

Burak Senel

Summer 2024 Intern, Iowa State University

Burak Senel is a doctoral student in Applied Linguistics and Technology at Iowa State University. With an M.A. in Natural Language Processing and extensive teaching experience, he was an early adopter of generative AI in academic writing instruction, which earned him media recognition. His publications with Sage and his talks across the U.S., U.K., Turkey, and Hong Kong emphasize the ethical use of generative AI in research and education. His research focuses on automated scoring and the use of generative AI in language assessment (e.g., interactive, life-like situations and agents).

My internship at Michigan Language Assessment was an incredibly enriching experience that complemented my Ph.D. studies with practical, hands-on learning. Throughout my time there, I had the unique opportunity to participate in one-on-one meetings with various teams, from marketing to test security to item development, gaining a holistic understanding of how an assessment organization operates (and often finding myself in awe of how much thought and effort went into ensuring a smooth and collaborative operation). I also attended regular staff and standard-setting meetings, where I observed real-world challenges and solutions in action. A standout experience was attending a recording session for listening passages in a professional studio, where I saw firsthand the fun and meticulous behind-the-scenes work that goes into creating listening items.

Professionally, Michigan Language Assessment greatly contributed to my development. The environment encouraged deep engagement with my work and offered many opportunities to interact with assessment stakeholders who were always receptive to my ideas and available for questions and feedback. I presented three times to Michigan Language Assessment stakeholders, covering topics like automated scoring, my dissertation work on interactive academic listening with generative AI, and my Michigan Language Assessment project – a white paper on automated scoring quality control and assurance. These presentations allowed me to speak to stakeholders outside the classroom and boosted my confidence in applying theoretical knowledge to real-world scenarios and in communicating my work to a diverse assessment audience.

The support I received from the assessment team and other Michigan Language Assessment teams was exceptional. I felt like everyone went out of their way to make me feel welcome and included, whether it was by offering help whenever needed or inviting me to social outings like a trip to a local gelato shop or giving me tickets to a football (soccer) game. I even had the opportunity to show an interactive listening item from my dissertation work to the CEO of Michigan Language Assessment and hear her thoughts.

Ann Arbor, MI, has a magical, forest-like atmosphere, warm-hearted people, great food, and attractions like the University of Michigan Museum of Natural History and the University of Michigan Museum of Art. It has now earned a spot on my list of favorite cities in the U.S.

My time at Michigan Language Assessment was both educational and enjoyable. I’m deeply grateful to all the Michigan Language Assessment members for creating opportunities that allowed me to grow both personally and professionally.

“Throughout the internship, I was continually impressed by the welcoming and supportive environment at Michigan Language Assessment and the window into the operational side of language testing. I was regularly involved in assessment team meetings, standard-setting sessions, and item review sessions.”

Yulin Pan

Summer 2024 Intern, University of Illinois at Urbana-Champaign

Yulin Pan is a doctoral student in Linguistics at the University of Illinois at Urbana-Champaign. She received her M.A. in Applied Linguistics from Teachers College, Columbia University, and her B.A. in English Language and Literature from Northeastern University in China. Her research interests include speaking assessment, L2 speech fluency, corpus linguistics, and natural language processing.

When I received the offer to intern at Michigan Language Assessment, I was thrilled at the prospect of working on large-scale language testing and applying my academic knowledge in a real-world setting. My main project, which focused on analyzing writing rubrics and rater behavior, turned out to be an invaluable opportunity to delve deep into the complexities of language assessment and contribute to meaningful improvements in rater feedback and training.

I conducted a study using a combination of generalizability theory (G-theory) analysis and many-facet Rasch measurement (MFRM) analysis. This hands-on experience greatly enhanced my understanding of test validation processes and rater behavior and introduced me to new methodologies and research topics. I’m now fascinated by rater behavior, and I’m excited to explore it further in my future research endeavors. It was also a rewarding process to collaborate with trained raters and consult with the statistician and psychometrician. Their insights were crucial in understanding the nuances of rater performance.

Throughout the internship, I was continually impressed by the welcoming and supportive environment at Michigan Language Assessment and the window into the operational side of language testing. I was regularly involved in assessment team meetings, standard-setting sessions, and item review sessions. Additionally, meeting with the business and marketing teams gave me valuable insights into the marketing and strategic promotion of language tests in international markets, an aspect of language assessment I had not previously considered. The sense of community at Michigan Language Assessment made my internship particularly special. It was wonderful to sit and chat with people, learning about their journeys into language teaching and testing and the career paths they have taken.

Outside of work, I also enjoyed exploring the charming town of Ann Arbor, MI. Whether it was enjoying meals in the downtown restaurants, visiting the local farmers’ market, or attending the Summer Festival and Art Fair, every moment added to the richness of my experience. The balance of professional engagement and personal enjoyment made this summer truly memorable.

In conclusion, my internship at Michigan Language Assessment was everything I hoped it would be and more. It provided me with hands-on experience in language test development, broadened my research capabilities, and introduced me to a network of professionals who are passionate about their work. As I move forward in my career, I will carry with me the lessons learned and the connections made during this incredible summer.

“The summer internship at Michigan Language Assessment was a great opportunity for me to experience the day-to-day operations of a language testing organization and understand the collaborative effort that goes into developing and administering a standardized language test.”

Haeun Kim

Summer 2023 Intern, Iowa State University

Haeun (Hannah) Kim is a PhD student in the Applied Linguistics and Technology program with a minor in Statistics at Iowa State University. Before joining the program, she taught English for six years to middle and high school students in South Korea. Her research interests include language assessments for young learners, argument-based validation in testing, corpus linguistics, and language learning in the digital wilds. She is currently working on her dissertation, which focuses on the use of picture-based narrative writing tasks in WIDA’s ACCESS.

The summer internship at Michigan Language Assessment was a great opportunity for me to experience the day-to-day operations of a language testing organization and understand the collaborative effort that goes into developing and administering a standardized language test. In addition, I really enjoyed working on my research project, which gave me a chance to delve deeply into MET’s construct and domain relevance. It was rewarding to engage in a topic that aligns so closely with my research focus on validation in testing. My mentors, Mika and Tahnee, provided invaluable advice and support while I was working on the project. Other team members at Michigan Language Assessment were also very welcoming and were willing to share their knowledge and expertise with me whenever I came to them with questions. 

My previous language assessment internship was fully remote at the height of the pandemic, so working in a hybrid setting at Michigan Language Assessment in Ann Arbor was a very new and exciting experience for me. I was impressed by the productive and collaborative environment MLA has created through a balance between in-person and digital interactions. On the days I worked remotely, I often found myself at the University of Michigan’s libraries. Their architectural beauty, combined with the vast collection of books and resources available to me as an MLA staff member, made them a perfect sanctuary for work and research.

During the weekends, I was able to visit beautiful places in Michigan people had recommended, such as Mackinac Island, the Sleeping Bear Dunes National Lakeshore, and Traverse City. Attending local events like the Ann Arbor Summer Festival and the Ann Arbor Art Fair made my summer in Michigan even more memorable. I know I will miss the short walks and conversations I had with team members on our way to Argus and Blank Slate. I look forward to meeting team members again at future language testing conferences and in other professional settings.

“I am so grateful that I’ve met many great people here. Everyone was willing to share knowledge about the company and patient to answer questions.”

Huiying Cai

Summer 2023 Intern, University of Illinois Urbana-Champaign

Huiying Cai is a PhD student in Linguistics at the University of Illinois Urbana-Champaign (UIUC). Her research interests mainly lie in L2 language assessment, focusing on the interplay of language testing and computational linguistics. Her studies involve natural language processing techniques on test-relevant corpora and quantitative research methods to provide empirical evidence for test validity arguments.

When I applied for the summer internship at Michigan Language Assessment, a well-known language testing company, I was excited about the opportunity to gain practical knowledge and experience with large-scale standardized language proficiency tests. That is indeed what I found when I arrived. The experience truly broadened my horizons, as I had previously worked only on a local test in a university setting. Working in a hybrid arrangement after the pandemic was also new and interesting for me.

I was assigned a research project to find factors that affect item difficulty to help item writers/developers produce more difficult items. The project focused on discrete listening and grammar items on the MET. I first studied the literature to identify what factors the field has already found. Then, I extracted a wide array of textual features using NLP tools for items at different difficulty levels and performed a statistical analysis to determine which features tend to affect (i.e., increase or decrease) item difficulty and which ones contribute more than others. I enjoyed the whole process, having a chance to utilize my skill set to solve problems in the real world. The moment I heard from the team that my findings might help made my day.

In addition, I was invited to meetings with different teams, both online and in person, where I had the opportunity to learn the overall structure of a testing company, the primary responsibilities of each team, and how those teams cooperate. This gave me a basic understanding of areas beyond assessment, such as marketing, operations, and business management. It was particularly intriguing when people discussed the same topic from different perspectives.

I am so grateful to have met many great people here. Everyone was willing to share knowledge about the company and patient in answering questions. Patrick and Jay gave me a lot of support on my research project throughout the whole process and encouraged me to ask for help. Tahnee was always there, caring about what I might be interested in and how I felt about hybrid work. She also gave me helpful suggestions on my project and future career. It was a wonderful experience working here.

During the weekends, I immersed myself in exploring the town. Ann Arbor is a lovely place full of excellent restaurants, cute markets and stores, and beautiful gardens. I also fell in love with its diverse culture and cheerful atmosphere.

“My mentors Sharon and Patrick were extremely patient and supportive. They were always available to answer my questions and direct me to the right resources. I would highly recommend this internship to anyone who is looking for short-term working experience and/or long-term research collaboration.”

Yiran Xu

Summer 2019 Intern, Georgetown University

Yiran Xu is a PhD candidate in Applied Linguistics at Georgetown University. She received her MA in Educational Studies from the University of Michigan – Ann Arbor and BA in English Language and Literature from Qingdao University in China. She conducts research in the fields of second language acquisition (SLA) and language testing with a primary focus on the development and assessment of L2 writing. She has been working for the Assessment and Evaluation Language Resource Center (AELRC) at Georgetown University since 2016, where she helped plan several language testing conferences and collaboratively developed a Mandarin C-test for research purposes. She is currently working on her dissertation “L2 Writing Complexity in Academic Legal Discourse: Development and Assessment under a Curricular Lens.”

Michigan Language Assessment (MLA) is a wonderful place for a summer internship, as it allowed me to get a taste of large-scale international language assessment and apply my knowledge of SLA and language assessment to actual test development. It has been a truly enjoyable and rewarding experience.

During my internship, I was tasked with two assignments that required me to work both independently and collaboratively. The first assignment was to prepare a literature review summarizing and synthesizing scholarship on training standard-setting panelists and to propose a new standard-setting study. The second assignment was to work collaboratively with a Writing Revision Team to develop a new rating scale for ECPE writing and plan the pilot study and data analysis. Both assignments allowed me to actively engage with the most current research and constantly explore the intersections between theory and practice in a large-scale assessment setting. While working on my assignments, I learned how to use FACETS to conduct many-facet Rasch analysis with the help of Gad and Patrick. I also regularly attended test development meetings, where I gained numerous practical insights from colleagues with different backgrounds and varying years of test development experience.

The MLA community is very welcoming and efficient. Thanks to their well-planned orientation program, I was able to quickly adjust myself to the new working environment without feeling overwhelmed. My mentors Sharon and Patrick were extremely patient and supportive. They were always available to answer my questions and direct me to the right resources. I would highly recommend this internship to anyone who is looking for short-term working experience and/or long-term research collaboration.

During my free time, I enjoyed chatting over lunch breaks and taking a walk in the neighborhood with my colleagues. It was also very nice to live in Ann Arbor again – I met with some old friends and went to the Summer Festival and the Art Fair which I didn’t have a chance to explore in the past.

“I am especially thankful to Gad for his guidance and support throughout the project. His extensive knowledge, expertise, and experience in language testing have all inspired me profoundly.”

Xiaowan Zhang

Summer 2019 Intern, Michigan State University

Xiaowan Zhang received her MA in TESOL from the University of Illinois at Urbana-Champaign and her BA in English from Wuhan University, China. Xiaowan is currently a PhD candidate in the Second Language Studies Program at Michigan State University, where she teaches introductory courses to TESOL minors. Her main research interests include language testing, language policy, and quantitative research methods.

When I received the internship offer from Michigan Language Assessment, I imagined with great excitement what it would be like to work in a well-known testing company. I dreamed of myself working on interesting projects, learning from testing professionals, and participating in the day-to-day business operations. Not only did all my dreams come true at Michigan Language Assessment, but I was also given much more.

My main task here was to develop a toolkit for investigating the impact of the Michigan English Test (MET) and its sister test for young learners, the MET Go!. This project provided me with an opportunity to further develop my research interest in test impacts. It required me to dig deep into the literature of test washback and impacts, to understand the constructs of the MET and MET Go!, and to create instruments that are appropriate for the specific context of South America, where the MET and MET Go! are currently being used. With the help of my mentor, Dr. Gad Lim, I also had the opportunity to pilot some of the survey questions with test takers in Costa Rica and Colombia. I am especially thankful to Gad for his guidance and support throughout the project. His extensive knowledge, expertise, and experience in language testing have all inspired me profoundly.

In addition to the impact project, I was involved in another research project that was aimed at investigating the cognitive diagnostic validity of the two-skill MET (listening and reading only). I worked with five assessment staff members to identify the skills that are necessary to answer the MET listening and reading items. I tagged two MET forms independently based on a skill list prepared by the team leader and discussed my tags with other members. Altogether, we resolved the differences in our tags and refined the original skill list through group discussion. This project has greatly expanded my knowledge of the application of structural equation modeling in test validation and has prompted me to explore the use of cognitive diagnostic modeling in my own research.

Aside from research experience, I also gained insight into the nuts and bolts of test development. I participated in all assessment staff meetings and content review sessions. Through these meetings and sessions, I learned a great deal about item writing, item review, and item analysis. I also had the opportunity to interview three business managers in South America and learned how tests are promoted to test takers and test centers in that region.

This internship program is the best way I can think of to spend my summer. All the staff members were warm and friendly and were eager to help me with any questions and concerns. Additionally, they gave me so many interesting ideas for spending my weekends that I was able to make the most of my stay in Ann Arbor. Thanks to them, I enjoyed every bit of my internship program. I will definitely miss the people here and Ann Arbor’s restaurants, peony garden, farmers’ market, civic concerts, summer festival, and art fair.

“I feel I not only learned a lot about language assessment, but I also met great people. The friendly environment where all staff members are eager to help each other and learn new things was one of the best parts of the internship.”

Senyung Lee

Summer 2018 Intern, Indiana University

Senyung Lee is a PhD candidate in second language studies at Indiana University, focusing on second language assessment. Her main research interests are testing L2 collocations and L2 writing assessment.

One of the main reasons I applied to intern at Michigan Language Assessment was to gain experience in quality assurance practices in a large-scale assessment context. Most of my previous hands-on experience in language testing involved writing test specifications, writing items, and developing rubrics, but I had not worked on post-development phases. I was excited when I learned that Michigan Language Assessment provides opportunities to work on quality management of existing tests.

I was tasked with laying out possible revisions to one of the tests, and I learned a great deal about stakeholder interests and practical concerns regarding a large-scale international English test. This was an invaluable experience for me because I learned to consider the big picture of test administration as a whole rather than focusing only on test constructs and individual items at the micro level. In addition, interns were invited to staff meetings, which allowed me to understand how staff members with different specialties work as a cohesive team.

Outside of work, I did my best to try different cuisines in Ann Arbor’s famous restaurants. Ann Arbor is such a vibrant city with great restaurants and coffee shops, and I will miss having a variety of options for food. I was also able to enjoy the Ann Arbor Summer Festival to the fullest.

My internship lasted only eight weeks, but I feel I not only learned a lot about language assessment but also met great people. The friendly environment, where all staff members are eager to help each other and learn new things, was one of the best parts of the internship. I’d also like to thank Gad Lim and Rachel Basse for their support and the very accommodating supervision they provided. This was an amazing opportunity for me to develop as a language tester.

“Overall, this internship was a valuable experience for me to grow professionally. And this experience was perfected by the beautiful, multicultural Ann Arbor, which I totally fell in love with.”

Phuong Nguyen

Summer 2018 Intern, Iowa State University

Phuong Nguyen received her MA in Applied Linguistics from the University of Melbourne, Australia. She taught academic English and English-for-Specific-Purposes courses to university-level students in Vietnam. Recently, she interned at the Center for Applied Linguistics with WIDA’s ACCESS 2.0 test development team and at Michigan Language Assessment working on MET Go! She is currently a PhD student in Applied Linguistics and Technology with a minor in statistics at Iowa State University where she also works as an assistant coordinator for the English Placement Test and instructor for an introductory linguistics course.

My career goal is to work for language testing agencies, developing language tests and investigating the extent to which they are well designed to elicit valid, reliable, and relevant information about examinees. I applied for an internship at Michigan Language Assessment because I knew it was a perfect place to experience firsthand how professionals collaborate in designing high-quality assessments and to broaden my perspective as a language tester and researcher.

During my time as an intern, I was involved in many projects related to the new MET Go! test. Most of my time was spent researching the use of checklists as a rating tool, examining the functionality of the rating tools for the speaking and writing tests, and helping test developers revise the rating tools and draft test development reports. These projects not only allowed me to apply what I had learned in my graduate programs but also broadened my horizons. I learned many things from my mentors, Patrick and Gad, and other test developers, including knowledge about learning-oriented assessment and new R packages. I also appreciated being involved in various meetings with different test development teams, which helped me understand the enormous amount of time, organization, collaboration, and creativity needed before launching a new test. I also enjoyed the supportive working environment and the friendly staff at Michigan Language Assessment, and the fact that our collaboration will extend beyond the internship.