Truths and trends of America’s youth

The Generation Lab is a polling and research firm studying young people and the trends that shape their world.

 

We translate youth views and behavior for media, academia, businesses, government, and the American public. Whether through a one-day snapshot poll, or a multi-year longitudinal study, we pursue youth truth for our clients and the American public. 

 

Team

Matin Mirramezani

Chief Operating Officer

matin@generationlab.org

  • Twitter

Sidhika Tripathee

Senior Web Developer

sidhika@generationlab.org

  • Twitter

Seth Goldstein

Chief Growth Officer

seth@generationlab.org

  • Twitter

Rebecca Oh

Methodology Strategist

rebecca@generationlab.org

  • Twitter

Cyrus Beschloss

Founder

cyrus@generationlab.org

  • Twitter

Bryan Woolley

Data Scientist 

bryan@generationlab.org

  • Twitter

Charles Ide

Senior Web Developer

charles@generationlab.org

Shrijana Khanal

Senior Research Strategist

shrijana@generationlab.org

  • Twitter

Garrett O'Brien

Partnerships Coordinator

garrett@generationlab.org

  • Twitter

Alana Jennis

Research Strategist

  • Twitter

Board of Advisors

Larry Irving

  • Principal, Irving Group

  • Fmr. Vice President, Global Government Affairs

  • Fmr. Assistant Secretary of Commerce for Communications and Information

Other boards served:

  • PBS 

  • Northwestern University 

  • Texas Tribune

Joe Dworetzky

  • Reporter and Cartoonist, Bay City News Foundation, Los Angeles Times

  • Fmr. City Solicitor for the City of Philadelphia under Mayor Ed Rendell

  • Fmr. Member of the Philadelphia School Reform Commission

Other boards served:

  • William Penn Foundation

  • Pennsylvania Energy Development Authority

Dr. Frederick Conrad

  • Director, Survey Methodology Program, University of Michigan

  • Research Professor, Joint Program in Survey Methodology, University of Maryland

Books:

  • Tourangeau, R., Conrad, F.G., Couper, M.P. (2013). The Science of Web Surveys. Oxford: Oxford University Press.

  • Conrad, F.G. & Schober, M.F. (Eds.) (2008). Envisioning the Survey Interview of the Future. New York: Wiley & Sons.

Recent grants:

  • “Surveying older populations using video communication technologies.” National Institute on Aging

  • “Collaborative Research: Video Communication Technologies in Survey Data Collection,” National Science Foundation Grant 

Dr. Emil Pitkin

  • Founder, GovPredict 

  • Won the Anvil Award as Wharton’s top instructor

  • Maverick PAC Under 40 leader

  • Serves on the board of the Government Relations Association

  • Has played Carnegie Hall

Carlos Watson

  • Founder & CEO, OZY

  • Has hosted shows on MSNBC, CNN, PBS, BBC, and beyond

  • Host of The Carlos Watson Show

Other boards served:

  • PBS

  • NPR

 

Methodology Summary

The Generation Lab conducts ongoing surveys that measure the attitudes and views of American youth on current issues and policies. With technological advances and a growing desire for privacy, computer devices have become a favored method of completing surveys, especially among the younger generation (Erens et al., 2018; Gnambs & Kaspar, 2014; Mavletova & Couper, 2013). Web surveys have also been shown to yield more comprehensive data than many other modes of data collection (Bowling, 2005). The Generation Lab’s approach is informed by these findings and relies heavily on web surveys to collect data efficiently nationwide, reduce costs, and minimize manual data-entry errors.

The Generation Lab generates samples that reflect the broader college demographic from a variety of perspectives. Our panel is built from a database of every college and university in the United States, allowing us to conduct high-volume, customizable polling on various segments of the college population. We obtain this compilation of colleges and universities from the Department of Education College Scorecard, which includes “all active Integrated Postsecondary Education Data System (IPEDS) institutions that participate in Title IV programs (either by disbursing aid or through deferments) and that are not solely administrative offices” (U.S. Department of Education, 2020). The list includes (but is not limited to) community colleges, historically Black colleges and universities (HBCUs), and women’s colleges. It is further refined to accurately reflect the intended target population; private for-profit schools and postgraduate-only institutions are explicitly removed from the frame.

The Generation Lab approaches colleges and universities in a random order to mitigate the biases that a non-random approach would introduce. The final frame used in our polling closely resembles a probability sample of adult men and women [ages 18 to 24] enrolled as college or university students in the United States.
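
A minimal sketch of this frame construction, assuming the College Scorecard’s documented CONTROL, UGDS, UNITID, INSTNM, and STABBR fields and an illustrative file name (a sketch only, not a description of our production pipeline):

    import pandas as pd

    # Illustrative sketch: load a College Scorecard institution-level extract.
    # The file name and column names are assumptions based on the public
    # Scorecard documentation.
    scorecard = pd.read_csv("Most-Recent-Cohorts-Institution.csv", low_memory=False)

    undergrad_enrollment = pd.to_numeric(scorecard["UGDS"], errors="coerce").fillna(0)
    frame = scorecard[
        (scorecard["CONTROL"] != 3)       # drop private for-profit institutions
        & (undergrad_enrollment > 0)      # drop postgraduate-only institutions
    ][["UNITID", "INSTNM", "STABBR"]]

    # Shuffle so that institutions are approached in a random order.
    frame = frame.sample(frac=1.0, random_state=42).reset_index(drop=True)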

To poll students, we draw a random sample from our panel. We verify student status using email addresses and include additional screening questions in our questionnaire. Throughout the polling process, we use a variety of safeguards to ensure the integrity of the data we collect, including survey protocols that prevent multiple responses and flag invalid inputs. Surveys are deployed to individuals who satisfy the qualifications of our intended frame. Once surveys are closed, our datasets have historically included roughly 800 to 4,000 completed surveys. An honorarium of varying value is provided to respondents who complete the survey.
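
A minimal sketch of these integrity checks, assuming a hypothetical response table with respondent_email and age columns (the column names and screening rules are illustrative only):

    import pandas as pd

    # Hypothetical response file; column names and rules are illustrative only.
    responses = pd.read_csv("survey_responses.csv")

    # Prevent multiple responses: keep only the first submission per respondent.
    responses = responses.drop_duplicates(subset="respondent_email", keep="first")

    # Flag inputs that fall outside the intended frame, e.g. the 18-24 age screen
    # and a simple student-email check.
    in_frame = responses["age"].between(18, 24)
    student_email = responses["respondent_email"].str.lower().str.endswith(".edu")
    responses["flagged"] = ~(in_frame & student_email)

    completed = responses[~responses["flagged"]]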

Weighting

After survey responses have been collected, The Generation Lab weights the results by gender and race, using population statistics from the Department of Education/U.S. Census Bureau. Our goal is to make the distribution of respondent characteristics match that of the target population through post-stratification calibrated weighting.

Our process involves partitioning survey respondents into gender strata and race strata and estimating the proportion of respondents within each stratum. Population proportions from [the Department of Education/U.S. Census Bureau/external administrative source] are identified for the same strata. The ratio of the population proportion to the sample proportion in each stratum is then used as the post-stratification adjustment factor. An example of calibrated weighting is shown in Example 1.

Example 1
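
A minimal sketch of this adjustment-factor calculation for a single weighting variable, using placeholder population shares rather than actual Department of Education or Census figures:

    import pandas as pd

    # Placeholder data for illustration; not actual population or sample figures.
    sample = pd.DataFrame({"gender": ["Female", "Female", "Female", "Male", "Male"]})

    population_share = {"Female": 0.55, "Male": 0.45}             # assumed target shares
    sample_share = sample["gender"].value_counts(normalize=True)  # observed shares

    # Adjustment factor = population proportion / sample proportion, per stratum.
    gender_weight = {g: population_share[g] / sample_share[g] for g in population_share}
    sample["gender_weight"] = sample["gender"].map(gender_weight)
    # Female: 0.55 / 0.60 ≈ 0.92; Male: 0.45 / 0.40 ≈ 1.13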

Calibrated weights are applied to the gender and race items of each survey observation. We calculate the product of the gender and race weights for each observation to obtain its final weight. Example 2 illustrates the calculation of final weights for a few rows of observations.

Example 2
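
A minimal sketch of this final-weight calculation, assuming hypothetical per-respondent gender and race adjustment factors:

    import pandas as pd

    # Hypothetical per-variable adjustment factors for three respondents.
    obs = pd.DataFrame(
        {
            "gender_weight": [0.92, 1.13, 0.92],
            "race_weight": [1.08, 0.85, 1.21],
        }
    )

    # The final weight for each observation is the product of its gender and race weights.
    obs["final_weight"] = obs["gender_weight"] * obs["race_weight"]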

Analysis

After applying weights, we analyze the results for each survey item using a variety of measures, including the mean and standard deviation, in order to study the opinions and viewpoints of adult men and women [ages 18 to 24] enrolled as college or university students in the United States. In addition, we delve further into the data by segmenting each survey item by respondent characteristics (e.g., determining the race distribution of each political party). Our statistical analysis also includes studying response trends and patterns across waves or iterations of similar surveys.
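
A minimal sketch of these weighted summaries, using hypothetical responses, weights, and party labels:

    import numpy as np
    import pandas as pd

    # Hypothetical weighted responses to a single item on a 1-5 scale.
    df = pd.DataFrame(
        {
            "response": [1, 4, 5, 3, 2, 4],
            "final_weight": [1.1, 0.9, 1.0, 1.2, 0.8, 1.0],
            "party": ["Dem", "Rep", "Dem", "Ind", "Rep", "Dem"],
        }
    )

    # Weighted mean and weighted standard deviation of the item.
    w_mean = np.average(df["response"], weights=df["final_weight"])
    w_std = np.sqrt(np.average((df["response"] - w_mean) ** 2, weights=df["final_weight"]))

    # Segment the same item by a respondent characteristic, e.g. party identification.
    by_party = df.groupby("party")[["response", "final_weight"]].apply(
        lambda g: np.average(g["response"], weights=g["final_weight"])
    )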

References

Bowling, A. (2005). Mode of questionnaire administration can have serious effects on data quality. Journal of Public Health, 27(3), 281–291. doi: 10.1093/pubmed/fdi031

Erens, B., Collins, D., Manacorda, T., Gosling, J., Mays, N. B., Reid, D., & Taylor, W. (2018). Comparing data quality from personal computers and mobile devices in an online survey among professionals. Social Research Practice, (7), 15–26.

Gnambs, T., & Kaspar, K. (2014). Disclosure of sensitive behaviors across self-administered survey modes: a meta-analysis. Behavior Research Methods, 47(4), 1237–1259. doi: 10.3758/s13428-014-0533-4

Mavletova, A., & Couper, M. P. (2013). Sensitive Topics in PC Web and Mobile Web Surveys: Is There a Difference? Survey Research Methods, 7(3), 191–205.

U.S. Department of Education (2020). Technical Documentation: College Scorecard Institution-Level Data. https://collegescorecard.ed.gov/assets/FullDataDocumentation.pdf

 

1133 Connecticut Ave., Washington, D.C. 20036

  • Twitter
  • Instagram
  • LinkedIn

© 2020 The Generation Lab