
Wise Running

Train smart, eat well, and enjoy the run.

EDUC 751 – Assignment 4

1. Watch videos!

With this post, we begin by exploring only your quantitative questions. We will return to your qualitative questions in later assignments.

In Assignment 3, you saw that your questions were much more complicated than the way you had originally stated them. Many other variables might affect the outcomes, and those variables are important to your study. In future assignments, we will look at how to either control for those extra variables or include them in the question.

For today, we want to examine sampling and descriptive statistics related to the variables in your quantitative questions.

The textbook gives definitions for all of these things. This post is about playing around with those ideas in a brainstorming session specific to your questions and your context.

2.  Add a comment to this discussion post (10 points)

  • Restate your QUANTITATIVE RESEARCH QUESTIONS.  You may choose to update them based on what you learned in post 3, but it is not required.
  • For both questions, identify the population to which you would like to generalize the results of your study.
  • Then identify the representative samples you would study in order to complete the study.
  • Identify how you will measure your DVs (dependent variables).
  • Identify how you will measure your IVs (independent variables).
  • Brainstorm a list of demographic information (descriptive statistics) that you would need to understand about your sample. These data show how well your sample represents the population; a short sketch of this kind of summary follows the list.
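
The descriptive statistics themselves are straightforward to produce once you have a sample roster. As a minimal sketch in Python (pandas), with a made-up roster, this is the kind of demographic summary that shows how well a sample matches its population:

import pandas as pd

# A made-up sample roster; in practice this would come from your own data.
sample = pd.DataFrame({
    "gender":    ["F", "M", "F", "F", "M", "F"],
    "race":      ["White", "Black", "White", "Hispanic", "White", "Black"],
    "locale":    ["rural", "urban", "suburban", "rural", "urban", "rural"],
    "years_exp": [3, 12, 7, 1, 20, 9],
})

# Categorical demographics: report proportions, then compare them with known
# population proportions (e.g., state report card data).
print(sample["gender"].value_counts(normalize=True))
print(sample["locale"].value_counts(normalize=True))

# Continuous demographics: report center and spread.
print(sample["years_exp"].describe())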

3.  Comment on the work of others with suggestions, feedback, and/or questions. (6 points total for at least 3 such comments)

DR. TAYLOR’S EXAMPLE – 

Question:  Does an increase in collegial interaction among high school mathematics teachers lead to increased scores on the TN End of Course tests?

Population:  All high school mathematics teachers and their students across Tennessee.

Sample:  A stratified sample of at least 100 teachers from rural, suburban and urban schools of various sizes – and their students.

Measurement of IV:  Collegial interaction will be quantified using the Taylor-Angelle Collegiality Scales.  (The validity and reliability of these scales were established in a previous study.)

Measurement of DV:  The EOC tests are already quantitative and available upon request.  (The validity and reliability of these tests were established in a study by the test authors.)

Demographics:

  • school size, locale, school diversity data with SES, gender, race
  • teacher data including diversity, education level, experience level
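
To make the sampling plan in the example concrete, here is a minimal sketch of drawing a stratified sample by school locale; the roster file and its column names are hypothetical:

import pandas as pd

roster = pd.read_csv("tn_math_teachers.csv")  # hypothetical statewide roster

per_stratum = 34  # roughly 100 teachers total across the three locales
stratified_sample = (
    roster.groupby("locale", group_keys=False)
          .apply(lambda stratum: stratum.sample(n=per_stratum, random_state=42))
)
print(stratified_sample["locale"].value_counts())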

157 responses to “EDUC 751 – Assignment 4”

  1. Mindy Volk May 25, 2018 at 2:58 pm

    Qualities of highly effective teachers.

    a.) After extensive PLCs on the TEAM rubric training (IV), what percentage of teachers (DV) scored a 4 or higher on level of effectiveness?

    Population: 9th grade teachers at Jefferson County High School

    Sample: A purposive sampling of freshman teachers at Jefferson County High School.
    Measurement: Teachers will be measured on an interval scale using the TEAM rubric and using the TVAAS data.

    Demographics: School size, school diversity data with SES, gender, ethnicity, teacher’s years of experience, teacher’s tenure at his/her current school

    b.) Do students’ test scores improve (DV) when a teacher’s classroom management is consistent (IV)?

    Population: 9th grade students and teachers at Jefferson County High School

    Sample: A purposive sampling of freshman students and teachers at Jefferson County High School.

    Measurement: Students’ test scores will be measured on an interval scale using the standardized test called TNReady.

    Demographics: School size, school diversity data with SES, gender, ethnicity, teacher’s years of experience, teacher’s tenure at his/her current school, academic level of students

    • Chad Lee May 25, 2018 at 5:17 pm

      I do not know how big your school is, but my first thought was that the sample may be too small to gather data that can reflect the county and/or state. Maybe you should expand the sample to 9th-grade students across the county so you get a wider variety of students and backgrounds. This also increases the percentage of teachers that you are studying. If the population is too small or too isolated, then the data may not be as widely accepted. I am interested in the classroom management component of this research, though. Good luck!

    • Teresa Kirkland May 26, 2018 at 4:21 pm

      Mindy,
      Depending on your school size, is the sampling a large enough number for reliability? If you do have to increase the numbers, would you be able to add additional schools that have 9th grade?

    • Wes Anderson May 26, 2018 at 7:52 pm

      Mindy,
      I like the idea of gauging the impact of consistent classroom management, as I think there could be a correlation. How will you measure how consistent a teacher’s classroom management really is, though?

    • Kelsey Walker May 27, 2018 at 11:09 pm

      Hey Mindy! Glad to be back in a class with you!
      My topic is touching on classroom management as well and I am finding that to get a measurement tool for management is rather difficult. I have decided to define management based on teachers’ written plans (I am focusing on positive plans versus punitive plans), and the outcome of their plans is measured on their number of referrals. Ultimately, I think we have to figure out what keeps kids IN the classrooms so they are receiving the instruction necessary to improve scores.
      I think it will benefit you for a quantitative study to determine a way that you want to measure consistency for the management portion. That way, you can have a definable correlation between something specific the teacher does and their students’ test scores.
      I cannot wait to read more about what you find!

  2. Chad Lee May 25, 2018 at 5:11 pm

    Quantitative

    Question: What is the impact of attending a single-sex school (IV) on students’ ACT scores (DV), in comparison to coed education?

    Population: All students in the United States in private schools (Single-sex and coed)

    Sample: Stratified sample of at least 100 students divided equally between various single-sex and coed private schools.

    Measurement: ACT scores will be requested from schools and compared.

    Demographics: School size, location, cost of enrollment, gender, race
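
    One way to picture the ACT comparison described above is an independent-samples t-test on the two groups’ mean scores. A minimal sketch, with made-up ACT composites (the real scores would come from the schools):

    from scipy import stats

    # Hypothetical ACT composite scores for the two types of schools.
    single_sex = [24, 27, 22, 30, 26, 25, 28, 23, 29, 26]
    coed       = [23, 25, 21, 28, 24, 22, 27, 24, 26, 23]

    t_stat, p_value = stats.ttest_ind(single_sex, coed)
    print(f"single-sex mean: {sum(single_sex) / len(single_sex):.1f}")
    print(f"coed mean:       {sum(coed) / len(coed):.1f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")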

    Question: What is the impact of attending a single-sex school (IV) on students’ acceptance rates (DV) at private institutions, as compared to coed schools?

    Population: All students in the United States in private schools (Single-sex and coed)

    Sample: Stratified sample of at least 100 students divided equally between various single-sex and coed private schools.

    Measurement: College Acceptances will be requested from schools and compared.

    Demographics: School size, location, cost of enrollment, gender, race, Particular College, College acceptance rate from Tennessee

    • Melissa Jolley May 25, 2018 at 9:35 pm

      I’m curious why you are choosing private schools? Are there any public schools that separate by gender? Is that even legal?
      I am also curious if parental investment has more bearing on student achievement than the actual act of separating the genders. Obviously the parents chose a gender-separate school for a reason and they are willing to fork over the money, so I am just wondering if that aspect is more powerful than simply separating gender.
      If you could find enough public schools that offer gender separated instruction, it might eliminate the weight of parental influence.

      • Chad Lee May 26, 2018 at 11:04 am

        There are a few single-sex public schools, but not many. There is an all-girls leadership academy that is a charter and an all-boys charter opening up next year near me. I eventually want to use this data to see if it would work in the public sector, but right now I feel the population is too small to only do public schools.

    • Valerie Orfield May 26, 2018 at 12:52 pm

      Hi Chad,
      I am also left wondering if there are public schools that are gender-specific, and (like Melissa) if that is even legal. I believe you will get very specific quantitative data from your study. What are you hoping to discover and how will your results be used? Interesting topic, and thanks for sharing.
      Kindly,
      Valerie

    • Clint Epley May 27, 2018 at 3:43 pm

      Chad,

      I have wondered this before, especially at certain ages and in certain subjects. Are you going to narrow your search to specific grade levels or subject areas? These schools may only be 9-12, but if it’s more widespread, say 6-12, will you focus on a specific area? I teach 8th grade and would be curious to see how an 8th grader would perform in this setting. I can also see certain subject areas benefiting, specifically English and/or Reading. The SES of these students could also be an interesting portion of the demographics to understand the effect it has on their ability to succeed.

      I like your topic and look forward to what you find.

      Clint

    • Carol Powell May 28, 2018 at 11:24 am

      Hi Chad,
      Is there an expectation that students attending single-sex schools have higher ACT scores and college acceptance rates? Are you going to narrow your sample to specific grade levels and/or geographic area? I will be interested to see what the research reveals about single-sex schools.

  3. Lori Hill May 25, 2018 at 7:31 pm

    Question 1: What are the effects of standards-based grading [IV] on student achievement [DV], as measured by the TNReady English I and English II End of Course exams?

    Population: All high school English I and English II teachers and their students in Sevier County, Tennessee.

    Sample: A purposive sampling of at least 10 teachers from English I and English II in Sevier County, Tennessee.

    Measurement: Teachers utilizing standards-based grading will be measured against TNReady scores on an interval scale. The EOC tests are already quantitative and available on the report card on the Tennessee Department of Education’s website.

    Demographics:

    Student data including school size, school locale, gender, race, class period, SES

    Teacher data including education level, years of experience, number of professional development days attended, mandatory vs. voluntary implementation of standards-based grading, age

    Question 2: What is the relationship between standards-based grading [IV] on English I and English II TNReady End of Course exams [DV] and teachers’ years of experience [DV]?

    Population: All high school English I and English II teachers and their students in Sevier County, Tennessee.

    Sample: A purposive sampling of at least 10 teachers from English I and English II in Sevier County, Tennessee.

    Measurement: Teachers utilizing standards-based grading will be measured against years of experience on an interval scale. The EOC tests are already quantitative and available on the report card on the Tennessee Department of Education’s website.

    Demographics:

    Student data including school size, school locale, gender, race, class period, SES

    Teacher data including education level, years of experience, number of professional development days attended, mandatory vs. voluntary implementation of standards-based grading, age

    • Chad Lee May 26, 2018 at 11:08 am

      Are teachers going to be randomly chosen from various schools in the county? I would suggest having a variety of teachers (age ranges, experience, gender, race, etc.) and various schools that provide a wider range of students as well. I think the purpose of this research is to get a grasp of the county in general from the data provided. Good luck!

    • Valerie Orfield May 26, 2018 at 1:11 pm

      Hi Lori,
      Forgive my ignorance, but what is “standards-based” grading? Or is this referring to student scores on EOCs? Or are these their grades throughout the year, and then you’re comparing it to their scores on the EOCs? I am an elementary teacher and this is new to me. Regards, Valerie

    • Laura Mason May 27, 2018 at 2:50 pm

      Will your study factor in the testing issues experienced by high school students?

    • Clint Epley May 27, 2018 at 3:33 pm

      What is the variety of teachers in the sample (experience, age, etc.)? This is interesting because standards-based grading is something I have flirted with.
      Looking forward to your findings,
      Clint

  4. Lori Hill May 25, 2018 at 7:32 pm

    Hello, Sonya.

    I am struggling to understand all of the information from our text, but for your measurement, are you going to use one or more of the specific measurements defined in our text? I chose an interval scale to compare my independent variables to my dependent variables, but I by no means know if I chose correctly. In reviewing your questions, you may want to consider reviewing some of the quantitative scales of measurement (ordinal, nominal, ratio, etc.) to see what might apply to your research.

    Kind regards,

    Lori Hill
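
    A minimal sketch, with made-up data, of the scales of measurement Lori mentions (nominal, ordinal, interval, ratio) and which summaries make sense for each; the variables below are hypothetical examples, not anyone’s actual measures:

    import pandas as pd

    # Made-up student records illustrating the four scales of measurement.
    data = pd.DataFrame({
        "race":        ["White", "Black", "White", "Hispanic"],           # nominal
        "proficiency": ["below", "approaching", "on track", "mastered"],  # ordinal
        "scale_score": [310, 325, 340, 355],                              # interval
        "absences":    [0, 3, 1, 7],                                      # ratio (true zero)
    })

    print(data["race"].value_counts())         # nominal: counts/mode only
    print(data["proficiency"].value_counts())  # ordinal: order matters, distances do not
    print(data["scale_score"].mean())          # interval: means and differences are meaningful
    print(data["absences"].sum() / len(data))  # ratio: ratios ("twice as many") are meaningful too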

    Hello, Teresa.

    For your demographics, might you also consider parental involvement/ support and how that affects truancy? I would also add, as I did in a previous classmate’s reply, that for your measurement, are you going to use one of the quantitative scales of measurement (ordinal, nominal, ratio, etc.) mentioned in our text? I am struggling to understand all of the information from our text, so I am really curious to see what others are deciding to use for measurements. I chose an interval scale, but I am uncertain if I chose correctly.

    Kind regards,

    Lori Hill

    Hello, Stefanie.

    Curiously, how are you going to measure consistent classroom management? Do you have a rubric or will this be in an interview process? I ask because I am trying to figure out how to exactly measure if/how teachers use standards-based grading? It was something I have been thinking about lately, and I wondered if you had thought of this also.

    Kind regards,

    Lori Hill

    • Mindy Volk May 26, 2018 at 7:23 am

      Hi, Lori!
      I probably would look at classroom write-ups or referrals as a measurement of classroom management; however, I find that to not be as accurate as I would prefer simply because some teachers ignore certain misbehaviors, and some teachers simply have more patience and do not issue as many write-ups or referrals.

  5. Melissa Jolley May 25, 2018 at 9:24 pm

    Question: How have community structures/educational structures influenced the education of minority populations?
    1. Examine the achievement scores (DV) of minority populations within two different community settings (IV): diverse schools and racially imbalanced schools.

    Population: High school students (9th-12th grade enrollment).
    Sample: One school sample will be taken from Metro schools with a diverse student population and another from a school with a low diversity scale. Samples will also be taken from the Sumner County school district, considered a suburban district.
    Measurement of IV: Diversity will be measured by percentage. Diverse schools will be classified as schools with 30%-60% minority populations, and racially segregated schools will be classified as populations with 90% or higher of one racial category (a short sketch of this rule follows the demographics list). Racial composition of schools and communities will be accessed through public databases.
    Measurement of DV: The EOC tests are already quantitative and available upon request.
    Demographics:
    • school size, locale, school diversity data with SES, gender, race
    • teacher data including diversity, education level, experience level
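
    A minimal sketch of the classification rule described in the IV measurement above, using the stated thresholds; the school figures here are made up:

    def classify_school(pct_minority: float, pct_largest_group: float) -> str:
        """30%-60% minority = diverse; 90%+ in one racial category = racially imbalanced."""
        if pct_largest_group >= 90:
            return "racially imbalanced"
        if 30 <= pct_minority <= 60:
            return "diverse"
        return "outside the study categories"

    # Hypothetical schools: (percent minority, percent in the largest racial group)
    schools = {"School A": (45, 55), "School B": (8, 92), "School C": (70, 30)}
    for name, (pct_minority, pct_largest) in schools.items():
        print(name, "->", classify_school(pct_minority, pct_largest))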

    Question: How have community structures/educational structures influenced the education of minority populations?
    2. Examine the relationship between minority enrollment in AP/Honors-level courses (DV) and minority faculty representation (IV).

    Population: High school students enrolled in AP and Honor level courses (9th-12th grade enrollment).
    Sample: One school sample will be taken from Metro schools with a diverse student population and another from a school with a low diversity scale. Samples will also be taken from the Sumner County school district, considered a suburban district.
    Measurement of IV: Faculty who teach AP/Honors-level courses will be classified as White, Black, or Non-White (specifying race or ethnicity when possible). (I am not sure whether these data are publicly available.)
    Measurement of DV: Minority students enrolled in AP/ Honors courses.
    Demographics:
    • school size, locale, school diversity data with SES, gender, race
    • teacher data including diversity, education level, experience level

    • Chad Lee May 26, 2018 at 11:26 am

      Melissa, are you going to compare each race against the others? Also, there are so many different races that people claim, so how will you measure that without offending the subjects? Another component you may want to consider when looking at minorities is the primary language spoken. For example, Hispanic students who don’t speak as much English may struggle in school because of the language barrier. Just a few thoughts. Good luck!

      • Melissa Jolley May 26, 2018 at 12:25 pm

        Hi Chad-
        Race and ethnicity can be tricky, as I am considered an “other.”
        I want to focus on dominant races, as defined by community or school census. I will probably look at subcategories that are available in county school systems. I think that I will most likely be limited to White, Black, Latino, and possibly an “other” non-White category, but I will not truly know until I start digging.

    • Wes Anderson May 26, 2018 at 8:03 pm

      Melissa,

      You should not have much of a problem accessing or ensuring the validity of any year’s AP test scores, but the honors tests (likely TNReady) may have a validity concern if you use this past year’s tests. Just food for thought.
      It’s quite obvious you have given extensive thought to the topic and these questions, as well as the accompanying information provided here. Looking forward to your findings!

    • Bryan Upshaw May 27, 2018 at 12:13 pm

      Melissa, I am interested to see what you discover. My school, unfortunately, has very little diversity. On one hand, I could see minorities as possibly less likely to get involved, since they look different and may come from a different cultural background, which could make them less likely to excel in AP-type classes. Small minority populations could also be intentionally or unintentionally ostracized or ignored because they are different. On the other hand, schools are judged on the performance of their minorities, so I could also see administrators and teachers placing more importance on the small minority population, which could actually help them to succeed. I also feel like this could vary significantly from one school to another; the school culture could make a huge impact on how minorities are treated. It will be a challenge to isolate your independent variable, but I still think these are really important questions to ask and study.

  6. Andrew Alder May 26, 2018 at 11:27 am

    Questions:
    1. Is there a significant correlation between student academic achievement on the TNReady assessment [DV] and the use of integrated mathematics instruction [IV], as compared to regular and honors courses?

    Population: All high school mathematics teachers and their students across Tennessee.

    Sample: A stratified sample of at least 100 teachers from rural, suburban and urban schools of various sizes – and their students.

    Measurement of IV: A survey of teachers’ perceptions of integrated mathematics instruction

    Measurement of DV: The EOC tests are already quantitative and available upon request. (The validity and reliability of these tests were established in a study by the test authors.)

    Demographics:

    school size, locale, school diversity data with SES, gender, race
    teacher data including diversity, education level, experience level

    2. What is the relationship between instructional format [IV] and student success factors, including TNReady assessment scores [DV], in regular and honors high school mathematics courses?

    Population: All high school mathematics teachers and their students across Tennessee.

    Sample: A stratified sample of at least 100 teachers from rural, suburban and urban schools of various sizes – and their students.

    Measurement of IV: A survey of teachers and focus groups to gather information about the instructional format.

    Measurement of DV: The EOC tests are already quantitative and available upon request.

    Demographics:

    school size, locale, school diversity data with SES, gender, race
    teacher data including diversity, education level, experience level

    • Valerie Orfield May 26, 2018 at 1:17 pm

      Hi Andrew,
      What is considered to be an integrated math course? What is “successful student academic achievement” on the TNReady assessment? Would this be considered a year’s growth, proficient, advanced? Thanks for sharing and good luck!
      Valerie

    • Michelle Hope May 26, 2018 at 4:07 pm

      Andrew,

      A worthy topic!

      I’m wondering whether your IV measurement is more qualitative than quantitative. How will you quantify teacher perceptions? How will you access data from rural, suburban, and urban schools? Does your district include those subgroups?

    • Cindy Widner May 27, 2018 at 3:46 pm

      Hi, Andrew, I am leaning toward what Michelle said. I agree that your second question seems more qualitative in nature because of the use of a survey of teachers’ perceptions. I like your topic. I’m interested in your findings since I work with the gifted and often encourage them to take the AP courses at the high school level.

    • Sarah Anderson May 27, 2018 at 5:51 pm

      How are you going to categorize instructional format?

  7. Chad Lee May 26, 2018 at 11:40 am

    The only thought I have on this topic is how the school decides to put students in general vs honors and if that will affect the success on the achievement test. Also, we will assume that the honors students should do better. With that being said, you may want to simplify the classes you use to eliminate the factor of different courses. I like the topic though. Good luck!

  8. Cindy Widner May 26, 2018 at 1:20 pm

    Quantitative:
    • Did the implementation of RTI2B (IV) have an effect on the number of overall office referrals (DV) in grades K-4?
    o Population: All students K-4 in the state of TN
    o Sample: A cluster sample of at least 700 students from two rural schools of various sizes.
    o Measurement: The ODRs from the past two years are already logged and available upon request.
    o Demographics: School size, school locale, school diversity data with SES, gender, race; teacher data including diversity, educational level, experience level

    • Did the implementation of RTI2B have an effect on student achievement in grades 2-4?
    o Population: All grades 2-4 in the state of TN.
    o Sample: A cluster sample of at least 420 students from two rural schools of various sizes.
    o Measurement: TVAAS scores from 2017 and 2018
    o Demographics: School size, school locale, school diversity data with SES, gender, race; teacher data including diversity, educational level, experience level

    • Teresa Kirkland May 26, 2018 at 4:29 pm

      Cindy,
      My topic centers around the use of TN Ready data. One concern I have is that, due to all the issues with TN Ready and the adjustments related to TVAAS, will my information be considered a valid comparison? Have you had those same worries?

      • Cindy Widner May 27, 2018 at 3:49 pm

        Yes, Teresa, I have, especially with all that went on this year. The only hopeful thing is that, in conversations with the teachers at the schools I’m hoping to use, I learned there were no serious “hiccups” with the test at these schools since it was all done paper and pencil. Our high school, however, had serious technological problems.

    • Jennifer Williams May 27, 2018 at 4:54 pm

      Cindy,
      How are you going to measure your IV, RTI2B in your questions? Side note, I love that you’ve picked a topic that you’ve consistently researched and discussed in our previous classes. You are definitely passionate about this topic.

      • Cindy Widner May 28, 2018 at 10:31 am

        Hi, Jennifer, yes, I’m very interested in this topic, as student behavior is becoming increasingly difficult and at times even more disruptive in classrooms. A friend of mine has been privy to the implementation process. I will probably use her notes along with teacher/principal interviews/surveys.

  9. Valerie Orfield May 26, 2018 at 1:42 pm

    Wiki/DP 4- The Quantitative Path
    1) Question: What courses and/or professional development (IV) are most effective for preparing pre-service and seasoned teachers to facilitate elementary science content (DV)?
    Population: All elementary science teachers and their students across TN
    Sample: A stratified sample of at least 100 tenured and non-tenured teachers in rural, urban, and suburban schools.
    Measurement: Effectiveness of courses and professional development will be measured by teachers’ attitudes toward science content (using a rubric) and their ability to facilitate science content, as measured by formative and summative assessments.
    Demographics: Teacher years of experience, teacher years of experience teaching science, teacher education level, school size, school location, school demographics served
    2) Question: What are progressive and innovative schools doing (IV) to attain achievement (DV) in inquiry-based science instruction?
    Population: All elementary science teachers and their students across TN
    Sample: A stratified sample of at least 100 tenured and non-tenured teachers in rural, urban, and suburban schools.
    Measurement: Achievement at schools designated as “STEM schools” versus those without designation can be measured by formative and summative assessments.
    Demographics: Teacher years of experience, teacher years of experience teaching science, teacher education level, school size, school location, school demographics served

    • Michelle Hope May 26, 2018 at 4:15 pm

      Valerie,

      As an instructional coach, I see a huge difference in how pre-service/novice/veteran teachers perform, respond to coaching and training, and adapt to changes. You might consider choosing one group unless you want to compare the differences. It’s also worth noting that Tennessee is making a huge shift in science next year, which will limit your data to the years previous to the shift or to the one year with the new standards.

  10. Cassie Worley May 26, 2018 at 2:02 pm

    Question: Do scores on the ENG I end of course assessment [IV] predict student achievement [DV] in other high school English courses?
    Population: All high school English students across Tennessee.
    Sample: A stratified sample of at least 100 students from rural, suburban and urban schools of various sizes.
    Measurement of IV: Released EOC scores in English I, II, III across the state of Tennessee
    Measurement of DV: Value Added Measures, student grades.
    Demographics:
    – school size, locale, school diversity data with SES, gender, race
    -teacher data including diversity, education level, experience level

    Question: Does a teacher’s instructional approach [IV] impact student scores [DV] on end of course assessments?
    Population: All high school English teachers and their students across Tennessee.
    Sample: A stratified sample of at least 100 teachers from rural, suburban and urban schools of various sizes – and their students.
    Measurement of IV: Instructional Quality Assessment or Intellectual Demand Assignment Protocol (The validity and reliability of these scales were established in a previous study.)
    Measurement of DV: The EOC tests are already quantitative and available upon request. (The validity and reliability of these tests were established in a study by the test authors.)
    Demographics:
    -school size, locale, school diversity data with SES, gender, race
    -teacher data including diversity, education level, experience level, instructional approaches utilized

    • Michelle Hope May 26, 2018 at 4:18 pm

      Cassie,
      I love the update to your questions! And I cannot wait for this research!

      It looks like, given your measurements, that you are actually measuring whether performance on the Eng I EOC predicts performance on other Eng EOC tests rather than their achievement in the course since you will not be looking at course grades.

      Awesome!

    • Teresa Kirkland May 26, 2018 at 4:26 pm

      Cassie,
      Do you have a list of types of instructional approaches that the teachers will use? How will you measure the approach against outside variables?

    • Karly Stache May 26, 2018 at 5:32 pm

      Cassie, this sounds interesting! I was discussing the possibility of finding a correlation between school benchmark and state EOC scores. It would be nice to tell a student, “If you do ‘x’ on this test, then you are likely going to do ‘y’ on another test.” They wouldn’t be going into a high-stakes test blind. Great updates!

      • Karly Stache May 26, 2018 at 5:33 pm

        Sorry.. I meant I was discussing the possibility of finding the correlation b/w benchmarks and EOCs with a colleague. I pressed “post” too quickly!
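
      As a minimal sketch of the benchmark-to-EOC idea discussed here, with made-up paired scores, a correlation plus a simple regression line is one way to turn “x on the benchmark” into a predicted “y on the EOC”:

      from scipy import stats

      # Hypothetical paired scores for the same students.
      benchmark = [62, 70, 75, 80, 85, 90, 55, 68, 77, 83]
      eoc       = [58, 66, 72, 78, 84, 88, 52, 65, 74, 80]

      r, p = stats.pearsonr(benchmark, eoc)
      fit = stats.linregress(benchmark, eoc)
      print(f"r = {r:.2f} (p = {p:.3f})")
      print(f"predicted EOC for a benchmark score of 72: {fit.intercept + fit.slope * 72:.1f}")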

    • Cindy Widner May 27, 2018 at 4:00 pm

      Hi, Cassie, In looking at your instructional approaches, are you examining the methods in which they present the material and assignments or the type of curriculum used? Or both? Although using both seems too large.

    • Andrew Alder May 28, 2018 at 10:17 am

      Cassie,

      I cannot wait to see the results of this study! I too would love to know if these scores predict the next tests scores. This is an awesome study!!

  11. Michelle Hope May 26, 2018 at 4:03 pm

    Research Question 1: How does changing teacher effectiveness (due to instructional coaching and measured by the district evaluation rubric) [IV] impact student growth and achievement on the NWEA Map assessment [DV]?
    Population: K-8 teachers and their students in urban public schools across the state of Tennessee
    Sample: Sample of K-8 teachers identified as needing and receiving coaching, and who are working in a school participating in a one to one coaching program.
    Measurement IV: Teacher effectiveness will be measured using the district evaluation rubric.
    Measurement DV: Student growth and achievement data will be drawn from the NWEA MAP test growth data.
    Demographics:
    1. District data including size, location, diversity (SES, gender, race)
    2. Teacher data including diversity, education level, experience level

    Research Question 2: Does participation in a structured coaching model [IV] impact a teacher’s observation scores? [DV]

    Population: K-8 teachers and their students in urban public schools across the state of Tennessee
    Sample: Sample of K-8 teachers identified as receiving coaching, and who are working at a school participating in a one to one coaching program.
    Measurement IV: This will be measured by teacher enrollment in a coaching model and/or the presence of a professional growth plan in the teacher’s observation platform.
    Measurement DV: Teacher scores will be measured using the district evaluation rubric.
    Demographics:
    1. District data including size, location, diversity (SES, gender, race)
    2. Teacher data including diversity, education level, experience level

    • Teresa Kirkland May 26, 2018 at 4:31 pm

      Michelle, I am really interested in your second research question. Teacher motivation might be a demographic that you could consider as well. The results might vary if they were forced into the coaching model instead of volunteering.

    • Karly Stache May 26, 2018 at 5:28 pm

      Michelle, are you going to focus on individual components of the TEAM model or an overall comprehensive view? Looking forward to hearing more about your research!

    • Angela Hilbert May 27, 2018 at 3:10 pm

      I am interested in your first question. This past school year was the first year for instructional coaches in my district. I am interested in if the guidance of an instructional coach does lead to overall student growth. For measurement I would look at different pieces of assessment data.

    • Cindy Widner May 27, 2018 at 4:04 pm

      Michelle, I’m very interested in your first question, as we began implementing co-teaching and instructional coaching after Christmas at select schools in our county. This upcoming year, we are going full-blown, putting our sped teachers back in the classroom full time instead of the aides being in there while the sped teachers are doing intervention. Good luck with your findings.

    • Matthew Smith May 28, 2018 at 9:14 am

      Michelle,

      How are you planning to get the evaluation scores from the teachers? I would think a lot of people wouldn’t want their information released. At least, I don’t think poorly evaluated teachers would want their scores released. Are you wanting to find several at each level (i.e. ineffective, effective, highly effective)? I’m also unfamiliar with the NWEA MAP test, but I really hope that those results are provided back to teachers in a much quicker fashion when compared to TN Ready.
      Best,
      M

  12. Teresa Kirkland May 26, 2018 at 4:19 pm

    QUANTITATIVE QUESTIONS
    1. Did an increase in truancy [IV] in middle school students lead to a decrease in student achievement [DV] on the 2018 TNREADY Mathematics assessment?
    Population: All middle school math students in Monroe County, Tennessee
    Sample: A stratified sampling of 100 middle school students in Monroe County, Tennessee.
    Measurement: Truancy will be measured by the Average Daily Attendance (ADA) records. TNREADY Mathematics test scores are already quantitative and ready upon request.
    Demographics: School size, school locale, school diversity data with SES, gender, ethnicity, previous TNREADY Mathematics test scores, previous and current attendance records.

    2. What effect does chronic absenteeism [IV] have on middle school student achievement [DV] on the 2018 TNREADY Mathematics assessment?
    Population: All middle school math students in Monroe County, Tennessee
    Sample: A stratified sampling of 100 middle school students in Monroe County, Tennessee.
    Measurement: Chronic Absenteeism will be measured by the Average Daily Attendance (ADA) records. TNREADY Mathematics test scores are already quantitative and ready upon request.
    Demographics: School size, school locale, school diversity data with SES, gender, ethnicity, age, previous TNREADY Mathematics test scores, previous and current attendance records.

    • Karly Stache May 26, 2018 at 5:27 pm

      Would you possibly consider using the 2017 testing data? The only reason I suggest this is because of all the testing issues that occurred this year. Last year’s data may be more valid. But if your district didn’t have any problems, disregard!
      Looks like an interesting topic!

      • Teresa Kirkland May 27, 2018 at 8:24 am

        Karly,
        I was concerned about the same thing. Our county didn’t have issues at the intermediate/middle school level, but did at the high school level.

    • Cindy Widner May 27, 2018 at 4:06 pm

      Teresa, we are both wanting to use the 2018 TNReady scores. I’m wondering if they will be released in time for us to use.

  13. Carol Powell May 26, 2018 at 4:48 pm

    Topic: Powerful Practices to Build Academic Success for Veterans in Higher Education

    As Perceived by Veterans Coordinators at Colleges in the State of Tennessee

    Quantitative Research Questions:

    What effects do veterans services [IV] have on student veteran persistence [DV] and completion rates [DV]?
    Population: Veterans Coordinators (VCs) at SACSCOC accredited colleges across Tennessee

    Sample: (Pop.) Will survey VCs at all two and four-year accredited colleges in Tennessee (about 55, will verify)

    Measure/Quantify IV/DV: Perceived effectiveness will be measured using a Likert scale

    Demographics:

    State colleges
    Private colleges
    2-year colleges
    4-year colleges

    Which practices [IV] are most effective for increasing student veteran academic success [DV]?
    Population: Veterans Coordinators (VCs) at SACSCOC accredited colleges across Tennessee

    Sample: (Pop.) Will survey VCs at all two and four-year accredited colleges in Tennessee (about 55, will verify)

    Measure/Quantify IV/DV: Rank ordering of a list of practices/veteran-specific services

    Demographics:

    State colleges
    Private colleges
    2-year colleges
    4-year colleges
    By: Carol Lynn Powell
    Date: 05/26/2018 4:21 pm
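
    A minimal sketch of summarizing Likert-type items and rank-ordering practices like the measurements described above; the practice names and ratings below are hypothetical (1 = not effective, 5 = very effective):

    import pandas as pd

    responses = pd.DataFrame({
        "priority_registration": [4, 5, 3, 4, 5, 2],
        "veterans_lounge":       [3, 4, 4, 2, 5, 3],
        "dedicated_advising":    [5, 5, 4, 4, 5, 4],
    })

    # Central tendency and spread for each practice...
    print(responses.agg(["mean", "median", "std"]).round(2))

    # ...and a rank ordering of practices by mean perceived effectiveness.
    print(responses.mean().sort_values(ascending=False))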

    • Laura Mason May 27, 2018 at 2:49 pm

      Curious if financial assistance and mentoring will be part of your focus?

    • Angela Hilbert May 27, 2018 at 3:04 pm

      Your study about academic success seems fascinating. I would include in the demographics which branch of the military the student veterans have served in. I would also consider whether the veteran has ever been deployed to active duty.

    • Michelle Wilson May 27, 2018 at 3:41 pm

      Carol,
      Did you consider going to VA hospitals or ROTC programs so you can have more places to gather data from veterans?

    • Matthew Smith May 28, 2018 at 9:18 am

      Carol,

      I think one major thing to include within your demographics might be age unless the current program currently has age restrictions. It would be interesting to try and find how age, “grit”, and veteran coordinator led activities impact veterans’ completion rate at TCAT/certification areas, 2 year degree, and 4 year degree marks.
      Best, M

      • Carol Powell May 28, 2018 at 11:33 am

        I had not previously considered including the TCATs, but now I think that I will take another look at my population and sample to see if that might be appropriate for my research study. Thanks.

  14. Karly Stache May 26, 2018 at 5:23 pm

    Question #1: How does standards-based feedback (IV) impact student achievement on
    summative benchmarks (DV) in 6th-9th grade math?

    Population: All 6th-9th mathematics teachers and their students across Tennessee.

    Sample: A purposive sampling of 6 classes of about 25 students per grade level (approximately 600 students) in 6th-9th grade math classes in Blount County School District

    Measurement of IV: The benchmarking platform, MasteryConnect. Students will take formative assessments on the platform and receive individual standard feedback, as well as level of proficiency (mastery, near mastery, remediation)

    Measurement of DV: MasteryConnect (similar to IV). Students will take a summative benchmark at the end of the quarter over all content learned thus far and receive single standard feedback.

    Demographics: school size, locale, school diversity data with SES, gender, race;
    teacher data including number of years teaching current content level, education level, experience level

    Question #2: How does the implementation of power standards (IV) impact student achievement
    on summative benchmark assessments (DV) in 6th-9th grade math?

    Population: All 6th-9th mathematics teachers and their students across Tennessee.

    Sample: A purposive sampling of 6 classes of about 25 students per grade level (approximately 600 students) in 6th-9th grade math classes in Blount County School District

    Measurement of IV: MasteryConnect. Teachers will give multiple, single-standard assessments on the power standards for their grade level using the platform and students will know their proficiency level on each power standard for their grade.

    Measurement of DV: MasteryConnect (similar to IV). Students will take a summative benchmark at the end of the quarter over all content learned thus far and receive single standard feedback.

    Demographics: school size, locale, school diversity data with SES, gender, race
    teacher data including number of years teaching current content level, education level, experience level

    • Wes Anderson May 26, 2018 at 7:45 pm

      Karly,
      You are measuring how the power standards will affect achievement; how will you gauge (or are you planning on gauging) their effectiveness as compared to traditional standards?

    • Michelle Wilson May 27, 2018 at 3:38 pm

      Karly,
      Could you use MAP information in your study too? Use their percentile rank or RIT score compared to the others at the same age and level. Just a thought!

    • Sarah Anderson May 27, 2018 at 5:53 pm

      What are you defining as 9th grade math and are you planning to account for advanced students who are taking subjects before the standard level?

    • Matthew Smith May 28, 2018 at 9:20 am

      Karly,

      What protocol are you going to use to ensure that the assessments are all graded in an appropriate manner? 600 students is a large sample when it comes to collecting all of this data and grading all of these assessments. Does MasteryConnect provide the assessments? Are the assessments graded online through MasteryConnect? If that’s the case, your data might be a lot easier to collect.
      Best,
      M

  15. Wes Anderson May 26, 2018 at 7:37 pm

    Question 1: What effect does Integrated Math Curriculum and its devotion to connecting topics (IV) have on student knowledge retention (DV) from grade to grade?

    Population: All 10th and 11th grade math students at Tennessee high schools that utilize Integrated Math Curriculums

    Sample: Stratified sample of 100 sophomore/junior math students from a school that utilizes integrated math curriculums

    Measurement: TNReady and EOC scores – already quantitative in nature – from the previous school year and the current one. The TNReady contains strands of questions pulled from previous course standards that could be used as a basis for measuring knowledge retention.

    Demographics: School size, genders, racial diversity and composition, location of school

    Question 2: How do ACT scores (DV) compare between a traditional curriculum and an Integrated Math Curriculum?

    Population: All sophomore/junior math students at Tennessee high schools

    Sample: Stratified sample of 100 sophomore/junior math students from both traditional and integrated math curriculums

    Measurement: ACT scores and TNReady scores will be utilized and analyzed for comparison.

    Demographics: School size, genders, racial diversity and composition, location of school

    • lauralynnroland May 27, 2018 at 4:33 pm

      Wes,

      I am curious whether Integrated Math is a high-school-only curriculum? Hate to show my cluelessness here :/ If it isn’t, would students who had the integrated math longer have a greater advantage over students who had not?

      • Wes Anderson May 28, 2018 at 8:15 pm

        Laura,

        Pre-high school math is basically the integrated curriculum as there is geometry and algebra and other topics rolled into one course. High school courses are generally algebra 1, geometry, then algebra 2. Integrated Curriculum would just continue rolling those into one course.

        I am researching and wondering if integrated curriculum helps students perform better on ACT and/or understand more about mathematics as a whole rather than just a course grade.

  16. Rachel Hicks May 26, 2018 at 9:12 pm

    Quantitative Question 1: What is the relationship between student engagement in a middle school classroom (IV) and academic achievement on benchmark assessments (DV)?

    Population: All middle school students, grades 6-8.

    Sample: A stratified sample of at least 100 students from grades 6-8.

    Measurement of IV: The student engagement will be quantified by using the Behavioral Observation of Students in Schools (BOSS) method.

    Measurement of DV: STAR Reading and iReady math diagnostic are the assessments which will be used. These benchmark assessments are given three times a year in August, December, and March.

    Demographics: The demographic information which would affect this study would be the school size, location, economic diversity, gender diversity, ethnic diversity of the students. Demographic information regarding teacher experience and teacher education level will also be noted.

    Quantitative Question 2: Does level of engagement-focused pedagogy (IV) correlate with number of discipline referrals (DV) in a given time?

    Population: All middle school students, grades 6-8.

    Sample: A stratified sample of at least 100 students from grades 6-8.

    Measurement of IV: The student engagement will be quantified by using the Behavioral Observation of Students in Schools (BOSS) method.

    Measurement of DV: The number of discipline referrals a teacher writes during a nine week grading period.

    Demographics: The demographic information which would affect this study would be the school size, location, economic diversity, gender diversity, ethnic diversity of the students. Demographic information regarding teacher experience and teacher education level will also be noted.

    • Jennifer Adkins May 27, 2018 at 12:15 am

      Rachel,
      I love the specifics you provide in your measurements of independent and dependent variables. How do you plan to organize your data in terms of grade level and student achievement and growth? Are you planning to pull a specific number of students from each grade level?

    • Jennifer Williams May 27, 2018 at 4:48 pm

      Rachel,
      I am unfamiliar with the BOSS method and had to research it, since I had been searching for an assessment to measure social/emotional issues for my questions. Have you used the BOSS before? Pearson stated that it is retired, but I found other information that makes me think it would be a useful tool. Just curious…

  17. Alvin Sanders May 26, 2018 at 10:29 pm

    Quantitative Questions
    1. Does student involvement in extracurricular activities increase the student graduation rate?

    2. What impact does student involvement in extracurricular activities have on the percentage of students who meet ACT benchmarks?

    Population – high school students in the state of Tennessee

    Samples – The senior classes from the two high schools in Hamblen County. The two high schools have an estimated 1600 students.

    Measurements – Study of school and district reports on graduation rates; ACT benchmark reporting compared to the percentage of students involved in extracurricular activities

    Demographics – Gender, family structure (two-parent homes vs. single-parent homes), socioeconomic status

    • Jennifer Adkins May 26, 2018 at 11:55 pm

      Alvin,
      In terms of your samples, I would be concerned that it is too large a sample of the population. It might be a bit more manageable to take a sampling of 50 to 100 students from each senior class. Otherwise, your measurements and demographics are very straightforward and easy to follow.

    • Mindy Volk May 27, 2018 at 7:17 am

      Alvin,
      I feel like our school (Jefferson County High School) focuses heavily on the ACT more than anything else, so your research is enticing to me. I know that West and East are similar in many ways, but do they pretty much have the same offerings as extracurricular activities as well? If one school offers many more activities than the other, that may make a difference in the comparison.

    • Bryan Upshaw May 27, 2018 at 11:55 am

      Your study would be fascinating, but I agree with some of the other comments that it would be a huge undertaking to collect data on 1600 students. Some data, such as ACT scores, the school already collects, but I would assume you would have to align the data with the students who do extracurriculars. This could be difficult.

    • Angela Hilbert May 27, 2018 at 2:59 pm

      Your study is interesting since most high school seniors have graduated. I know in my school district the senior class in both of our high schools is large. I would recommend including no more than 200 students from both high schools. I would also look at which specific extracurricular activities increase the graduation rate.

      • Misty Meadows May 27, 2018 at 4:05 pm

        Hi, Alvin. I am very interested in your study; I currently work at a high school. I would be interested to see your data. What is considered an extra-curricular activity? Would it need to be an after school activity or could it be an in school club option?

    • Cindy Widner May 27, 2018 at 4:14 pm

      Hey, Alvin, I am wondering if that isn’t too large of a population. I’m looking at about 600 students from K-4 in two different schools and I’m feeling overwhelmed. Maybe you could randomly choose no more than 400 students. That is still going to be an overwhelming amount of data to sort through.

    • Andrew Alder May 28, 2018 at 10:21 am

      Alvin,
      I look forward to these results. Do both high schools have a total of 1600 seniors or is that total students? If that is total seniors, that is a lot of data to have to sort through. Just wondering!! Good luck!

  18. Jessica Ordonez May 27, 2018 at 12:03 am

    Question: How does participation in OST STEM programs (IV) affect standardized test scores (DV)?
    Population: Third- and fourth-grade public-school students in Tennessee
    Sample: A cluster sampling of students enrolled in OST STEM programs in Oak Ridge, TN.
    Measurement of IV: Participation is defined as attending an afterschool program that is self-described as a STEM program
    Measurement of DV: A comparison of standardized test scores to previous years’ scores
    Demographics: gender, reading level, SES, race, ethnicity, program cost, program quality, teacher competence

    Question: How does OST STEM program quality (IV) affect student outcomes (DV)?
    Population: Third and fourth grade, public-school students in Tennessee
    Sample: A cluster sampling of students enrolled in OST STEM programs in Oak Ridge, TN.
    Measurement of IV: OST STEM program quality will be measured by evaluating: interactions between staff and students, positive relationships between students, skills taught, appropriate academic level, mastery orientation.
    Measurement of DV: Data on student school attendance, benchmark testing, and behavioral data
    Demographics: gender, reading level, SES, race, ethnicity, teacher competence, level of student participation

    • Jennifer Adkins May 27, 2018 at 12:23 am

      Jessica,
      How many students do you plan to sample within the population? Will there be an equal amount from each grade? Great work on your demographics and measurements!

      • Jessica Ordonez May 27, 2018 at 8:09 pm

        Jennifer, thanks for catching that. I could just put a number of 100. But, I honestly do not know how many total students are even enrolled in OST STEM programs in Oak Ridge. I have emailed and called the four other institutions that offer OST STEM programs.

    • Michelle Wilson May 27, 2018 at 3:34 pm

      I like the STEM question. It is the latest instructional idea that is spreading through all educational programs. Did you consider looking at information about students during the first science and math push in education, during the time of the Race to the Moon? Maybe look at why the math and science push stopped and has now been picked back up.

      • Jessica Ordonez May 27, 2018 at 8:12 pm

        Michelle,
        What a wonderful idea! I have so many questions about OST STEM projects my problem is settling on one 🙂

    • Alvin Sanders May 27, 2018 at 4:42 pm

      Jessica,
      I have the same question as Jennifer. I think you may want to have a target number for sample size.

  19. Jennifer Adkins May 27, 2018 at 12:36 am

    Quantitative Question 1:
    What is the impact of technology use [IV] on student achievement [DV] in the Humanities classroom?
    Population: All high school students at a public suburban high school.
    Sample: A stratified sample of at least 100 students ranging in grades 9-12. An equal sampling will be taken from each grade level and will be done through the Humanities classes.
    Measurement of IV: Technology use will be measured through teacher input and feedback on current technologies utilized in the classrooms.
    Measurement of DV: Student achievement will be measured via benchmark testing for growth in specific standards related to the Humanities classroom in question. The classroom will vary based on students being assessed and specific teachers participating in the study.
    Demographics: The demographic information that would affect this study would be the students’ socio-economic statuses, ethnic diversity, educational background, school size/location, and county or district policies.

    Quantitative Question 2:
    What is the impact of technology use [IV] on students’ 21st Century skills [DV]?
    Population: All high school students at a public suburban high school.
    Sample: A stratified sample of at least 100 students ranging in grades 9-12. An equal sampling will be taken from each grade level and will be done through the Humanities classes.
    Measurement of IV: Technology use will be measured through teacher input and feedback on current technologies utilized in the classrooms.
    Measurement of DV: Students’ 21st Century skills will be measured via skills testing and student feedback.
    Demographics: The demographic information that would affect this study would be the students’ socio-economic statuses, ethnic diversity, educational background, school size/location, and county or district policies.

    • Mindy Volk May 27, 2018 at 7:20 am

      Hey Jennifer,
      What type of technology is available? I am curious if you will be looking specifically at laptops, iPads, etc., or if you are looking at technology use as a whole. (Our school implemented 1:1 this past year). Excited to see what you discover!

    • Misty Meadows May 27, 2018 at 4:00 pm

      Hi, Jennifer. Cool idea. I like that you are going to utilize the humanities area to do the surveys or gather data. This gets a wide range of students. What type of funding does this district have for technology in the classroom?

    • Donna Wineland May 27, 2018 at 4:17 pm

      Jennifer,
      Your first question is so important to me as an English teacher. I’m always trying to balance time on laptops with other activities and modes of learning. Will your benchmark testing be teacher-created or state/district?

    • Alvin Sanders May 27, 2018 at 4:47 pm

      Jennifer,
      Are you researching technology use by students, teachers, or both as it impacts achievement? During our study block time, I have noticed an increase in the apps students rely on instead of teacher notes.

  20. Clint Epley May 27, 2018 at 11:22 am

    1. Do Chromebooks (IV) increase student achievement (DV)?

    Population: School districts across the United States.

    Samples: At least 100 employees from a small (seven school) southern middle Tennessee 1:1 school district.

    Measurement of IV: Survey of number of Chromebooks in the system as well as information from teachers regarding time spent on devices daily.

    Measurement of DV: TNReady and ACT score reports will be observed from the 2014 – 2015 school year to the 2017 – 2018 school year.

    Demographics: School sizes, teacher experience, gender, SES, teacher education levels, technology department employees, number of students, internet access in the community

    2. Do 1:1 technological professional development opportunities (IV) increase teacher utilization (DV) of devices in the classroom?

    Population: School districts across the United States

    Samples: At least 100 employees from a small (seven schools) 1:1 school district in southern middle Tennessee.

    Measurement of IV: Survey of number of professional development (PD) sessions offered annually over a four year period (2014 – 2015 to 2017 – 2018). Any increases or decreases in PD sessions offered will also be noteworthy.

    Measurement of DV: Survey detailing percentage increased or decreased regarding teacher usage of devices post professional development. Increases and decreases in teacher utilization of devices over the same four year period will also be noteworthy.

    Demographics: School sizes, teacher experience, gender, SES, teacher education levels, technology department employees, number of students, internet access in the community

    • Donna Wineland May 27, 2018 at 4:14 pm

      Clint,
      Your first question is one I think we’d all like to have answered. I’m curious if you’ll include the reliability (and lack thereof) of the quickly outdated devices. There just seem to be too many variables to consider in the scope of this question; I am anxious to see how the correlations to success will play out. Good luck!

    • Cindy Widner May 27, 2018 at 4:22 pm

      Clint, does your research look at all grades or specific ones? Also, were the Chromebooks already in use in 2014-2015? If not, that would seem to provide some good, solid data to compare.

      • Clint Epley May 28, 2018 at 3:04 pm

        2014-2015 was the year 1:1 (Chromebooks were the choice) was implemented in grades 3-8, with grades 9-12 following the next year. So four years of implementation have nearly passed. The devices themselves haven’t changed, but maintaining and insuring them has been an issue. Furthermore, I’m not sure they provide any real value to student achievement – the initiative is so new that research hasn’t been done, at least to my knowledge. Data from previous years may be hard to come by, but in all honesty my dissertation will likely be qualitative.

        Thanks for the feedback!

        Clint

  21. Bryan Upshaw May 27, 2018 at 11:52 am

    Question 1:
    What effect does video conferencing (IV) have on oral communication (DV) in the foreign language classroom?

    Population: All high school foreign language students (I could narrow it down to all high school Spanish foreign language students if needed).

    Sample: a stratified or convenience sample of 20-30 students

    Measurement: Oral assessment of 25 questions

    Demographics:

    My school has around 800 students and is about 96 percent Caucasian, drawing from a mix of suburban and rural areas; around 5% of students are mixed race. It is not a Title I school; 34% of students are low-income, 52% are female, and 48% are male. Teachers are mostly Caucasian.

    Question 2:
    What effect does video conferencing (IV) have on written understanding of a foreign language (DV)?

    Population: All high school foreign language students (I could narrow it down to all Spanish foreign language students if needed).

    Sample: a stratified or convenience sample of 20 students

    Measurement: Written multiple choice assessment of 25 questions

    Demographics:

    My school has around 800 students and is about 96 percent Caucasian, drawing from a mix of suburban and rural areas; around 5% of students are mixed race. It is not a Title I school; 34% of students are low-income, 52% are female, and 48% are male. Teachers are mostly Caucasian.

    • Jennifer Williams May 27, 2018 at 4:38 pm

      Bryan,
      I’m curious how you’ll measure the IV in both of your questions. Will you compare students who participate in the video conferencing to those who do not?

    • Susan Dalton May 27, 2018 at 6:19 pm

      Bryan,
      I assume this type of video conferencing is already happening and there is a comparison group to measure against. 20 students seems like a small sample size to account for the population you are targeting. There are many factors that may determine how students learn in both groups. I would be interested to know how you plan to account for these.

  22. Laura Mason May 27, 2018 at 2:48 pm

    Question: Does an increase in the use of technology (IV) among middle school students increase their scores on TN Ready essay writing subtests (DV)?

    Population: Middle school students of a virtual school serving students across Tennessee.

    Sample: A stratified sample of at least 100 middle school students.

    Measurement of IV: Students are provided the same laptop with identical software and access to same online platforms. Time spent using technology will be calculated within the program(s) utilized.

    Measurement of DV: The TN Ready essay writing subtests are already quantitative and available upon request. (The validity and reliability of these tests were established in a study by the test authors.)

    Demographics: The school is a K-8 virtual school with 1,300 students. The school serves multicultural male and female students across the state of Tennessee from diverse home learning environments.

    Question: What effect does the use of technology (IV) among middle school students have on writing interventions (DV)?

    Population: Middle school students of a virtual school serving students across Tennessee.

    Sample: A stratified sample of at least 100 middle school students in RTI.

    Measurement of IV: Students are provided the same laptop with identical software and access to same online platforms. Time spent using technology will be calculated within the program(s) utilized.

    Measurement of DV: Fidelity checks and progress monitoring of specific writing skills will be collected by trained RTI specialists.

    Demographics: The school is a K-8 virtual school with 1,300 students. The school serves multicultural male and female students across the state of Tennessee from diverse home learning environments.

    • Clint Epley May 27, 2018 at 3:49 pm

      Laura,

      Your topic is very similar to mine. Mine’s more along the lines of how to successfully implement 1:1 technology in a school system, where “successfully” includes student achievement. My system has been 1:1 for the previous four school years. I often wonder if it’s had a positive effect on student achievement. Will you look into teacher usage of the devices? I know some teachers who use ours all the time while others hardly ever look at them (which isn’t a problem).

      Interesting and very important topic as districts spend time and money on devices.

      Clint

      • Misty Meadows May 27, 2018 at 4:08 pm

        Hi, Laura. Interesting topic. I think it would be interesting to note how much money your district spends per student on technology and whether the students have computer and internet accessibility at home.

  23. Susan Dalton May 27, 2018 at 2:48 pm

    Question 1:
    How much time is saved for school personnel when enrollments are completed at a centralized intake facility?
    Population: The population would be school enrollment personnel and ELL teachers.

    Sample: six Family/Community Liaisons that complete enrollments at the centralized intake facility and one ELL teacher that completes all the screenings

    Measurement: Self-reported time cards showing the time the enrollment started and ended

    Demographics: The Welcome Center (centralized intake facility) enrolled 109 students and screened 82 students in a six month time period. 71 students enrolled in school in the United States for the first time. All of these students had a different native language than English with Spanish being the most prominent. These students enrolled in 42 different schools in K-12th grade.

    Question 2: Do Spanish-speaking parents feel more satisfied/confident with enrolling their children in a school system when they complete the enrollment at a centralized intake facility or at an individual school?
    Population: All Spanish-speaking parents who enroll their child(ren) in school.

    Sample: 50 Spanish-speaking families that enroll at a centralized intake facility and 50 families that enroll in a school building.

    Measurement: A Likert-scale survey that parents answer after completing the enrollment process

    Demographics: The county in which this study takes place serves about 3,600 ELL students; among these students, Spanish is the most prominent language. The central intake facility enrolled 109 students in a six-month period. Two elementary, two middle, and two high schools would be used to compare parent responses to those at the intake facility. These schools would be located on the other side of the county, where it would be less likely for parents to opt to drive to the centralized enrollment facility.
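    One possible way to summarize and compare the two groups of Likert responses, sketched below; the data file, the column names, and the choice of a Mann-Whitney U test (a common choice for ordinal Likert data) are assumptions, not part of the plan above.

    ```python
    # Sketch: descriptive statistics plus a nonparametric comparison of satisfaction
    # ratings between the two enrollment settings. File/column names are placeholders.
    import pandas as pd
    from scipy import stats

    surveys = pd.read_csv("enrollment_surveys.csv")   # columns: setting, satisfaction (1-5 Likert)
    print(surveys.groupby("setting")["satisfaction"].describe())

    central = surveys.loc[surveys["setting"] == "central_intake", "satisfaction"]
    school = surveys.loc[surveys["setting"] == "school_building", "satisfaction"]
    u_stat, p_value = stats.mannwhitneyu(central, school, alternative="two-sided")
    print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
    ```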

    • Jordan Reed May 31, 2018 at 5:27 pm

      This is an interesting topic. My inclination is to say that surely having a central facility complete enrollments will save time for everyone. However, some negative aspects arise, too, such as families’ reluctance to go through such a process due to the potential to feel alienated from their child’s school. If I were enrolling a child, I’d like to do it at a school so I could maybe go on a tour and meet staff and teachers, but looking at it from a top-level perspective, it would probably save a huge amount of time to have the enrollment paperwork completed offsite so parents just show up to school with their child ready to go.

  24. Angela Hilbert May 27, 2018 at 2:52 pm

    The Quantitative Path- Sampling Descriptive Statistics & Inferential Stats

    Question 1: How many students with mild disabilities are proficient or advanced in literacy on the TN Ready (DV) when taught in an inclusive classroom (IV)?
    Population: All elementary and middle school students with mild disabilities attending Washington County Tennessee Schools.
    Sample: A cluster sample of at least 100 students with mild disabilities from elementary and middle schools in Washington County Tennessee Schools
    Measurement: Data from the 2017-2018 TN Ready Literacy is already quantitative and is available on the district and school report card on the Tennessee Department of Education’s website.
    Demographics: School size, school locale, general education class size, special education teachers’ case load, gender; Teacher data including experience level and education level.

    Question 2: What percentage of students with disabilities (DV) are proficient or advanced in literacy compared to students without disabilities (IV)?
    Population: All elementary and middle school students attending Washington County Tennessee Schools.
    Sample: A purposive sampling of students without disabilities and students with disabilities in Washington County Tennessee Schools.
    Measurement: Both the TN Ready Literacy Assessment and STAR 360 Reading Assessment will be used to analyze the variable of this question.
    Demographics: School size, school locale, general education class size, special education teachers’ case load, gender; Teacher data including experience level and education level.

    • Jennifer Williams May 27, 2018 at 4:34 pm

      Angela,
      How will you determine mild disability? Would it be easier to list the specific eligibility criteria you will target in your population, for example SLD in reading?

    • Susan Dalton May 27, 2018 at 6:03 pm

      Angela,
      At one point I had thought about a topic that would use TN Ready scores, but to be quite honest, I was worried about them being valid given the issues that have repeatedly happened with the testing companies and platform. Those would be factors that you would need to be aware of and at least point out in your study should those problems arise for the testing cycle you wish to use.

  25. Michelle Wilson May 27, 2018 at 3:30 pm

    Topic: Job satisfaction of teachers in urban school
    Quantitative Research Questions:
    1. Do teachers in urban schools that identify with an “I think I can” mentality [IV] have higher achievement scores in their classes [DV]?
    (Population) Teachers in urban schools that teach content classes with a state test and have an “I think I can” mentality.
    (Sample) Teachers in an urban school of any grade K-12 that have taught more than 3 years and have a tested subject.
    (Measure DV) Using MAP results of students under specific teachers at the 40th percentile or higher in Reading and Math, as well as TN Ready results of students at Level 3 (On-track) or Level 4 (Mastered) in Reading, Math, Social Studies, or Science.
    (Measure IV) Survey of questions that ask if the teacher has a mentality of “I think I can.”
    (Demographics) Teachers’ experience level, present teaching position, present teacher location, teacher test scores, teacher Tripod data (student perception information about the teacher should be at least a Level 3), and survey information to help identify teachers with an “I think I can” mentality.

    2. Do teachers with more self-efficacy in urban schools [IV] have higher achievement scores in their classes [DV]?
    (Population) Teachers with strong self-efficacy level that teach tested subjects in urban schools.
    (Sample) Teachers in an urban K-12 school that have taught more than 3 years and have a tested subject.
    (Measure DV) Using MAP results of students under specific teachers at the 40th percentile or higher in Reading and Math, as well as TN Ready results of students at Level 3 (On-track) or Level 4 (Mastered) in Reading, Math, Social Studies, or Science.
    (Measure IV) A survey that rates a teacher’s self-efficacy; teachers with more self-efficacy will be identified using a checklist procedure.
    (Demographics) Teachers’ experience level, present teaching position, present teacher location, teacher test scores, teacher Tripod data (student perception information about the teacher should be at least a Level 3), and survey information to help identify teachers with more self-efficacy.

    • lauralynnroland May 27, 2018 at 4:27 pm

      Hi Michelle,

      I find your topic very interesting. I was curious, because I was pondering the survey option as well, what were your thoughts concerning the checklist procedure? Is it a checklist that is already in use or something you would have to develop? You always have good thoughts 🙂

    • Susan Dalton May 27, 2018 at 6:09 pm

      Michelle,
      I, too, am looking at the possibility of using a survey, but I’m struggling to determine if it is a valid and/or reliable means of measurement. I’m not sure how to make that happen if this study is the first time it is used to collect data.

  26. Misty Meadows May 27, 2018 at 3:57 pm

    1. What effect do elementary drug prevention programs (IV) have on high school student drug use (DV)?

    Population – 9-12 students in the Dickson County School District

    Sample – 500 total students gathered from both high schools in the district

    Measurement of IV – Survey district leaders to determine types of prevention programs that are taught, how frequently, and age of students when programs are introduced.

    Measurement of DV – Survey students to determine if peer pressure or outside factors cause drug use, or if they use tactics learned in drug prevention programs to abstain from poor choices.

    Demographics – school population, age that the student participates in the program, length of the program, curriculum covered during program, percentage of attendance in program

    2. What outside factors (IV) influence drug use in teens (DV)?  

    Population – 9-12 students in the Dickson County School District

    Sample – 500 total students gathered from both high schools in the district

    Measurement of IV – Use school information systems to determine SES, Ethnicity, SWD

    Measurement of DV – Survey students to determine frequency and type of drug use

    Demographics – home environment, socio-economic status, parent/guardian awareness, race/ethnicity, SWD subgroup, peer group, public or private school, rural vs. urban school districts

    • Jessica Ordonez May 27, 2018 at 8:39 pm

      Misty,
      A great study and I will be interested to see your results. How reliable do you think a questionnaire about drug use will be?

    • Janie Evans May 27, 2018 at 9:54 pm

      Misty, I like how specific you are in your responses. The use of surveys is also interesting.

    • Julia Wenzel-Huguley May 27, 2018 at 10:34 pm

      Misty- I think you have chosen a really interesting topic. It would be interesting to complete this in several districts of different sizes and demographics to see if the results compare. The others that have commented have raised some interesting questions, and I would agree with them in terms of honesty from the students who are reporting. I think it would be difficult to get students to respond to such a survey; do you have any plans in place to motivate students to participate? Maybe it could be connected to something school-wide, or have some sort of incentive: the grade level with the highest participation percentage could get a reward of some sort?

  27. Donna Wineland May 27, 2018 at 4:10 pm

    Quantitative Questions
    Question 1: What effect does differentiated instruction (IV) have on eleventh graders (DV)?

    Population: High school juniors in Tennessee

    Sample: A stratified sample of at least 100 students of varying skill and performance levels

    Measurement of IV: Surveys to teachers regarding frequency and competency in DI

    Measurement of DV: Teacher-created formative and summative assessments in English and select Social Studies courses.

    Demographics: College preparatory independent school of approximately 500 students of mixed skill levels, 10-15% ethnic diversity, middle to high socioeconomic status. Teacher demographics to include years of experience, years of education, and perhaps years of service in current position.

    Question 2: What strategies of differentiation (IV) work most efficiently in each section of junior-level humanities classes (DV)?

    Population: College Preparation (CP) or Advanced Placement (AP) juniors in Tennessee

    Sample: A stratified sample of at least 100 students of varying skill and performance levels

    Measurement of IV: Surveys to teachers regarding most effective forms of DI

    Measurement of DV: Teacher-created formative and summative assessments in English III, AP Language, and select Social Studies courses.

    Demographics: College preparatory independent school of approximately 500 students of mixed skill levels, 10-15% ethnic diversity, middle to high socioeconomic status.

    • Jennifer Williams May 27, 2018 at 4:28 pm

      Hi Donna,
      Are you going to choose certain differentiation strategies to focus on? I have differentiation as a focus of one of my questions. I was thinking, with so many different strategies, would it be possible?

    • Sarah Anderson May 27, 2018 at 5:46 pm

      Donna,
      How do you plan to measure differentiated instruction?

    • Jessica Ordonez May 27, 2018 at 9:00 pm

      Donna,
      Will the teacher collaborate to create one standard set of formative and summative assessments or will each classroom/teacher create their own?

    • Kelsey Walker May 27, 2018 at 11:02 pm

      Donna,
      I am wondering if you could use the results of the AP exam as your DV Measurement tool in your second question since those results would already be established quantitative results that could be accessed fairly easily, as long as you were not looking at specific individuals’ data. I think the broad spectrum of strategies that teachers use to differentiate and assess may make your measurements difficult to pin down as they currently are stated. I am very interested in this!

  28. lauralynnroland May 27, 2018 at 4:22 pm

    a) Does the number of graduates (male and female) (DV) who continue in a post-secondary STEM field increase after the implementation of the STEM program (IV)?
    Population: Graduates who have gone through a high school STEM program
    Sample: Graduates of Central Magnet School who have gone through the STEM program
    Measurement of IV: AdvancED STEM Standards and Indicators and ELEOT (these standards and indicators have been quantified; the ELEOT has been quantified by being utilized in almost 90,000 classrooms)
    Measurement of DV: Percentage of graduates who continued in a post-secondary STEM field
    Demographics: Student data: gender, race, data with SES, school STEM program offerings

    b) Does the STEM program (IV) improve ACT scores (DV), specifically in the areas of Math and Reading?
    Population: Students who participate in a STEM program
    Sample: 10th, 11th, 12th grade students of Central Magnet School who participate in the STEM program and have completed the ACT.
    Measurement of IV: AdvancED STEM Standards and Indicators and ELEOT (these standards and indicators have been quantified; the ELEOT has been quantified by being utilized in almost 90,000 classrooms)
    Measurement of DV: The ACT is already quantitative. 10th and 11th grade ACT scores will be used before students enter the STEM program.
    Demographics: gender, race, data with SES, school STEM program offerings

    • Sarah Anderson May 27, 2018 at 5:48 pm

      Do you plan to differentiate between the different levels of students when analyzing them based on what specific classes they have taken?

    • Jessica Ordonez May 27, 2018 at 8:55 pm

      Are there high school students who have been in an AdvancEd high school for four years? Or are you looking for data from students who have spent any amount of time in a STEM certified high school? In TN the first schools were certified in 2016.

    • Melissa Jolley May 27, 2018 at 9:17 pm

      FANTASTIC TOPIC!
      I would really like to know how STEM initiatives are impacting student achievement and whether they are providing meaningful career paths for students after they graduate.
      I would also like to know how it is impacting girls. I know that traditionally there are fewer girls in these fields. Perhaps look into gender ratios. I would hypothesize that girls would be drawn to STEM fields that are helpful or serve a broader purpose. It would be interesting to identify those fields that have a higher ratio of girls.

      To your second question, what STEM provides is more context for math and science, and I would want to know if this allows students to dig deeper. When they contextualize their understanding of concepts, does this lead to higher achievement scores? This leads us to more fundamental questions about whether all the shifts in standards are worth it: does contextualized learning affect achievement?

    • Janie Evans May 27, 2018 at 10:00 pm

      Interesting topic. It will be especially interesting to see if STEM improves ACT scores.

  29. Jennifer Williams May 27, 2018 at 4:23 pm

    Question 1: What is the effect of differentiated instruction strategies in general education on the social emotional well-being of students who are gifted?

    Population: K-8 students who are identified as gifted in Tennessee
    Sample: Cluster Sample of identified gifted students in K-8 schools in Chattanooga
    Measurement of IV: Direct Observation of teacher using targeted differentiation strategies
    Measurement of DV: DESSA (The Devereux Student Strengths Assessment) Given prior to and after a determined period of time
    Demographics:
    school size, number of students identified gifted per school and grade, number of eligibility years per student, years’ experience of teachers along with endorsement areas, IEP services received

    Question 2: What is the relationship between direct service time and the attitudes of students who are gifted?

    Population: K-8 students who are identified as gifted in Tennessee
    Sample: Stratified sample from K-8 gifted students at my school
    Measurement of IV: Review of the electronic Individualized Education Program listing direct services
    Measurement of DV: Bipolar Adjective Scale
    Demographics: # of identified gifted students in each grade, gender, # of years of eligibility, Curriculum used in direct instruction, experience of direct service teachers, Other IEP services

    • lauralynnroland May 27, 2018 at 4:43 pm

      Hi Jennifer,

      I really like your first question. I notice that you use the word “targeted” for the IV measurement: are you thinking two strategies or more? Also, how are the students being taught while the observation is taking place (i.e. normal class time, a special gifted/enrichment class)? A really interesting topic.

    • Janie Evans May 27, 2018 at 9:51 pm

      Jennifer, I am especially interested in how you will measure differentiated instruction. Direct observation may be something that could work for me with RTI interventions.

    • Julia Wenzel-Huguley May 27, 2018 at 10:31 pm

      Jennifer- Just wondering- do you have access to K-8 students in your school? I wonder if it would be worth looking at another district that doesn’t have the same grade setup and seeing whether they have similar data. I remember a discussion that the fewer building changes the better; I wonder if that could be a factor here as well. It may also be worth considering the number of students per caseload and whether the teacher is only working with gifted students or is responsible for other students with an IEP as well.

  30. Sarah Anderson May 27, 2018 at 5:44 pm

    1. Question: Does standards based teaching in a secondary mathematics class increase student test scores on state assessments or ACT/SAT tests?
    Population: 7-12 grade mathematics students in my school district.
    Sample: A random sample of at least 100 students for each tested subject or level spread out through the different schools in the county from both teachers teaching by standards based practices and those not.
    Measurement of IV: A survey of teaching practices to determine standards based or not.
    Measurement of DV: The EOC tests are already quantitative and available upon request. (The validity and reliability of these tests were established in a study by the test authors.)
    Demographics:
    • school size, locale, school diversity data with SES, gender, race
    • teacher data including diversity, education level, experience level
    2. Question: Does standards based teaching in secondary mathematics classes increase the fluency of the topics?
    Population: 7-12 grade mathematics students in my school district.
    Sample: A random sample of at least 100 students for each tested subject or level spread out through the different schools in the county from both teachers teaching by standards based practices and those not.
    Measurement of IV: A survey of teaching practices to determine standards based or not.
    Measurement of DV: Students will be given a test at the end of each grade/subject level to test fluency of topics.
    Demographics:
    • school size, locale, school diversity data with SES, gender, race
    • teacher data including diversity, education level, experience level

    • Jessica Ordonez May 27, 2018 at 8:33 pm

      Sarah,
      I find your question and study ideas to be interesting. I worry that finding 100s of students will be difficult. Have you considered narrowing your focus from grades 7-12 to something more manageable?

    • Julia Wenzel-Huguley May 27, 2018 at 10:28 pm

      Sarah- I think you thought this through very well! Just a question, and maybe because I’ve been in ELA and now German that I don’t understand the scope when it comes to math, but is there a reason your population and sample draw from 7-12? I just wonder if it wouldn’t be better to focus on 10/11, since isn’t that usually the time they’re taking the ACT/SAT? I’m just wondering if it would skew your information because there are a lot of students in that range that wouldn’t be taking the ACT/SAT. Just something to consider, maybe?

      • Sarah Anderson May 28, 2018 at 6:37 am

        The reason for 7-12 is that content standards for math are done in grade bands, where content overlaps. Some students begin taking Algebra 1 in middle school, and my area of focus is secondary mathematics. I’m also interested in seeing if the grade level factors into it. In all reality, at the moment, I am planning to do a qualitative study for my dissertation.

  31. Julia Wenzel-Huguley May 27, 2018 at 7:48 pm

    The effect of world language study on academic achievement
    Looking to answer the question: “Does taking a world language affect student achievement?”

    Do high school students who take four years of a world language have higher ACT writing scores?

    Population: High School Seniors, particularly in Kansas
    Sample: Dependent on the number of high school seniors that have taken four years of a world language, as large of a sample size as possible, better if taken from more than one school in more than one district
    Measurement of IV: Counting of students, categorized based on world language participation or not
    Measurement of DV: Collection of ACT writing scores to be analyzed
    Demographics: School size, male/female ratio, native language first spoken, average GPA

    How many students in a class of seniors take four years of a world language?

    Population: High School Seniors, particularly in Kansas
    Sample: High school in suburban Kansas
    Measurement of IV: Counting of students in their senior (4th year) of high school
    Measurement of DV: Counting of students, sorting into categories of those that have/have not taken a world language for four years
    Demographics: School size, male/female ratio, native language first spoken, average GPA

    • Jessica Ordonez May 27, 2018 at 8:21 pm

      Julia,

      What is your plan for obtaining data from more than one high school in one district?

    • Melissa Jolley May 27, 2018 at 9:05 pm

      This is a super interesting topic. I have wanted to look into the effects of being bilingual on the acquisition of communication skills. My mother grew up in Japan and did not learn English until high school. I noticed that she has very poor communication skills in either language. My daughter attended Japanese school all the way until the 7th grade, and while there, we met many 1/2 children (half American, half Japanese) and I started to see the same lag in language. It was almost as if they could only acquire so much of a language, but they could never be masterful in either. Some of the subtle nuances of language were missed.

      Some people are naturally gifted in language and can pick them up quickly (my husband and daughter), while others have more difficulty with linguistics in general.

      I also think that it is interesting that you are choosing writing as your DV. My daughter could write more beautifully than a native Japanese kid (her writing samples were even published in a Japanese newspaper several times!) yet, she never felt fluent or could speak in a casual manner. I had always thought that formal written language was more difficult to acquire. Language is so complex!
      Can’t wait to see what you find!

    • Kelsey Walker May 27, 2018 at 10:56 pm

      For research purposes, I am wondering if your second question would become more valuable if you could figure out how the scores of those students who DO take the foreign language show a benefit, like you did in your first question. Your second question would not justify further research or a change in policy or practice given the results, but I think where you are going with this COULD be really important research. One direction you could take this, one I am super interested in, is whether students in a class of seniors who DO take a world language experience a higher college application acceptance rate. It would be interesting to know how taking these foreign language classes benefits students’ appeal to colleges. Just a suggestion! I am very interested in learning your findings for your first question; any time I have learned about another language, it has helped me significantly with my own language!

    • Travis Jolley May 28, 2018 at 12:27 am

      Julia,
      Your topic of world language study and ACT writing scores is appealing to me. I have heard that studying foreign languages can have many positive effects even in the native language. In your study I am wondering if when you say four years of a world language you mean the same language studied 4 years consecutively. For example, in high school my brother took a mixture of Latin and Spanish courses. Also, I am wondering if you are primarily interested in seeing the ACT scores from the students’ senior year after the foreign language courses have been completed. I’m looking forward to seeing your study in action!

    • Carol Powell May 28, 2018 at 11:02 am

      Julia,
      This is a very interesting topic. Do you have an expectation of what your research will find? How will you determine whether taking a language affects achievement or higher achieving students tend to take language courses?

  32. Janie Evans May 27, 2018 at 9:49 pm

    I am not sure if I am doing this correctly so any feedback is appreciated!

    Research Question: What are the barriers to ELA RTI Tier 2 success in high school?
    (1) Quantitative Question #1: Do ELA RTI Tier 2 interventions (IV ) in grade 9 improve ELA EOC scores (DV) in grades 9-11?

    Population: All past and present Tier II students in grades 9-11 spanning three years in three high schools.

    Sample: A stratified sample of at least 25 students from three high schools—each with a distinctly different demographic and population.

    Measurement of IV: At least 2 research based ELA RTI Tier 2 interventions will be studied and measured looking at the amount of time allotted for students to work on these specific interventions.

    Measurement of DV: The EOC tests are already quantitative and available upon request.

    (2) Quantitative Question #2: Do ELA RTI Tier 2 interventions in grade 9 improve ACT reading and English scores?

    Population: All 11th grade students who received ELA RTI Tier 2 interventions in the 9th grade in three high schools.

    Sample: A stratified sample of 25 students from each of the three high schools.

    Measurement of IV: At least 2 research based RTI Tier 2 interventions will be measured, looking at the amount of time students spend working in these interventions.

    Measurement of DV: The ACT reading and English scores are already quantitative and available upon request.

    • Janie Evans May 27, 2018 at 9:58 pm

      I noticed I forgot to include my demographics in my response. My demographics would be school size, diversity data, number of students in ELA RTI Tier 2 interventions, number of teachers and/or assistants in each program including diversity, educational level, experience, and training.

    • Travis Jolley May 28, 2018 at 12:42 am

      Janie,
      I am glad to see you are studying the effectiveness of RTI. This is a critical component of our goal of providing for the needs of all students. I was wondering why you specifically are interested in looking at students who received interventions in 9th grade. It is my understanding that RTI interventions can be implemented at any grade in high school which would seemingly affect the EOC scores. Also, your second question says the population is all 11th grade students who received tier 2 interventions. Is this the only year you would be looking at their ACT scores? I wish you success on your research!

      • Janie Evans May 29, 2018 at 8:57 pm

        Travis: Thank you for the feedback. I am looking at 9th grade interventions since this is such a critical year for high school success. Yes, you are correct about RTI implementation; however, in reality, 9th grade is the typical year that high schools use RTI Tier 2 instruction. In our district, we have students specifically scheduled for RTI Tier 2 interventions with a classroom teacher. RTI Tier 2 scheduling is problematic in high school, so I want to know if what we are doing is actually working, and if not, what we should change. It seems that our Tier 2 students who do test out of RTI continue to struggle in their classes. As to your second question, all students must take the ACT in their 11th grade year. While we do have some students take the exam earlier, it isn’t many. I want to know if students are getting those foundational skills in 9th grade that set them up for success. I feel like RTI should address these needs, and we have to find what works.

  33. Kelsey Walker May 27, 2018 at 10:50 pm

    Do teachers of students in urban schools in grades 6-12 who have taken at least one behavioral training course on mental health (IV) during their teacher preparation program have fewer behavioral referrals (DV) resulting in in-school or out of school suspensions than teachers in the same grade levels who have had no courses in behavioral training?
    Population: Grades 6-12 teachers across East Tennessee
    Sample: A stratified sample of at least 100 teachers, an equal number of teachers from each grade level 6-12
    Measurement of IV: Training on mental health will be measured using teacher surveys; questions will be developed to make the research consistent for reporting on mental health training (for instance, one question may read “Have you participated in a Mental Health Training Course specifically designed for teachers for at least one hour?”).
    Measurement of DV: Referrals will be recorded using administrative data for the teachers that participated in the surveys for the IV.
    Demographics: school size, college prep programs for teachers involved, gender, race, documented disabilities, education location and level, years of experience, school locale, school diversity, behavior programs offered
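    A minimal sketch of the comparison implied above: referral counts for teachers who report mental health training versus those who do not. The merged survey/administrative file and its column names are hypothetical.

    ```python
    # Sketch: compare behavioral referrals between teachers with and without
    # mental health training. "teacher_referrals.csv" and its columns are placeholders.
    import pandas as pd

    data = pd.read_csv("teacher_referrals.csv")   # columns: teacher_id, mh_training (True/False), referrals
    summary = data.groupby("mh_training")["referrals"].agg(["count", "mean", "median"])
    print(summary)
    ```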

    Do teachers of students in urban schools in grades 6-12 with written positive behavior management plans (IV) have fewer behavior referrals (DV) resulting in in-school or out of school suspensions than teachers who implement punitive classroom management plans?
    Population: Grades 6-12 teachers across East Tennessee
    Sample: A stratified sample of at least 100 teachers, an equal number of teachers from each grade level 6-12
    Measurement of IV: Positive behavior plans will be measured by evaluating written classroom management plans for distinctly positive or punitive policies, according to general knowledge of the two types of management based on the literature review.
    Measurement of DV: Referrals will be recorded using administrative data for the teachers that participated in the classroom management evaluation for the IV.
    Demographics: school size, gender, race, education location and level, years of experience, school locale, school diversity, school-wide policies for discipline

    • Travis Jolley May 28, 2018 at 12:10 am

      Kelsey,
      You have clearly given a lot of thought to planning your questions. I also want to account for many of the demographics of your participants including education level of teachers, student diversity, and school locale. I am wondering if in your study you are planning to include teachers who may teach multiple grade levels such as related arts or special education teachers. Also, in your sample I would seek to include teachers from all types of schools including magnet and alternative. Your proposal looks promising!

  34. Travis Jolley May 27, 2018 at 11:51 pm

    Question 1: What levels of per pupil spending [IV] do schools maintain in relation to standardized test achievement [DV]?
    Population: All public schools and their students across Tennessee.
    Sample: All public schools and their students in Hamilton County, Tennessee.
    Measurement: Per pupil expenditures by school in relation to TNReady assessment data by school.
    Demographics: School size, affluence of the community, percentage of special education students, ELL student population, grants received, school type (traditional, magnet, alternative, etc.), diversity levels
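    A minimal sketch of the school-level relationship described above, summarized with a Pearson correlation; the merged data file and its column names are hypothetical placeholders.

    ```python
    # Sketch: correlate per-pupil expenditure with mean TNReady performance by school.
    # "hamilton_schools.csv" and its columns are hypothetical placeholders.
    import pandas as pd
    from scipy import stats

    schools = pd.read_csv("hamilton_schools.csv")   # columns: school, per_pupil_spend, tnready_mean
    print(schools[["per_pupil_spend", "tnready_mean"]].describe())

    r, p = stats.pearsonr(schools["per_pupil_spend"], schools["tnready_mean"])
    print(f"Pearson r = {r:.3f} (p = {p:.4f})")
    ```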

    Question 2: What is the relationship of teacher effectiveness ratings [IV] at schools compared to student performance [DV]?
    Population: All public-school teachers and students in Tennessee.
    Sample: All public-school teachers and students in Hamilton County, Tennessee.
    Measurement: Project COACH evaluation scores in Hamilton County at all schools in relation to their TNReady assessment data.
    Demographics: Teacher experience level, teacher education level, percentage of special education students, ELL student population, school type (traditional, magnet, alternative, etc.), how teachers are assigned, administrator experience with Project COACH, percentage of students performing below grade level according to RTI data.

    • Jordan Reed May 31, 2018 at 5:21 pm

      I’m not familiar with Project COACH. Are the evaluation scores more objective or subjective? What are their criteria for evaluating teacher effectiveness?

  35. Matthew Smith May 28, 2018 at 9:06 am

    1. How does chronic absenteeism (I/V) impact student GPA (D/V) and ACT scores (D/V)?
    a. Population: Hamilton County 12th grade students (who have previously taken the ACT)
    b. Representative Samples: a stratified sample of at least 100 12th grade students in Hamilton County
    c. DVs: GPA and ACT scores are already collected and quantified and can be pulled with HCDE approval
    d. IVs: Absenteeism can be pulled from PowerSchool and Cognos and can be gathered with HCDE approval
    e. Demographics: school size, school location (Urban, Rural), ethnicity, gender

    2. How does chronic absenteeism (I/V) impact student performance in meeting ACT college readiness benchmarks (D/V)?
    a. Population: Hamilton County 12th grade students (who have previously taken the ACT)
    b. Representative Samples: a stratified sample of at least 100 12th grade students in Hamilton County
    c. DVs: ACT scores are available and can be compared to TN determined ACT readiness benchmarks (HOPE scholarship qualifications)
    d. IVs: Absenteeism can be pulled from PowerSchool and Cognos and can be gathered with HCDE approval
    e. Demographics: school size, school location (Urban, Rural), ethnicity, gender
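    A minimal sketch of the comparison implied above, flagging chronic absenteeism as missing 10% or more of enrolled days (a common definition) and comparing mean GPA and ACT composite across the two groups; the merged export and its column names are hypothetical.

    ```python
    # Sketch: chronically absent (>= 10% of enrolled days missed) vs. peers,
    # compared on GPA and ACT composite. File/column names are placeholders.
    import pandas as pd

    students = pd.read_csv("hc_seniors.csv")   # columns: days_enrolled, days_absent, gpa, act_composite
    students["chronic"] = students["days_absent"] / students["days_enrolled"] >= 0.10
    print(students.groupby("chronic")[["gpa", "act_composite"]].agg(["mean", "count"]))
    ```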

    • Carol Powell May 28, 2018 at 10:52 am

      Matthew,
      Your topic reminds me of the Rita Pierson videos on the topic from previous classes. It looks like you have a good plan for an ex post facto research study, and it is great that you can access the existing data for your study. I believe it is an important topic. As an advisor at a community college’s orientations for new students, I see so many students who require developmental classes in reading, writing, and/or math (based on low ACT scores) before they can take college-level courses.

    • Katrich Williams May 28, 2018 at 8:51 pm

      Matthew,
      PowerSchool is actually a great tool that keeps up with student absenteeism. I am pretty sure it would be useful in terms of measuring. Your topic seems like one that could provide a lot of data.

  36. Amy Perry May 28, 2018 at 7:23 pm

    1. What is the relationship between screen time (IV) in school and academic achievement/success (DV)?
    Population: 3rd through 5th graders in Rutherford County, TN
    Sample: 3rd through 5th graders in 5 different schools
    Measurement of IV: Screen time will be measured via survey and data collection (website/instructional program recording).
    Measurement of DV: Achievement will be measured through TNReady scores and through benchmark assessments given by the county.
    Demographics: Needed for the entire county, including grade, teacher effect, teacher experience, gender, student GPA, student socioeconomic status, and ethnicity. These same demographics will be needed for the sample group.

    2. Is student satisfaction/perception of school (DV) affected by time spent on devices (IV) during the school day?
    Population: 3rd through 5th graders in Rutherford County, TN
    Sample: 3rd through 5th graders in 5 different schools
    Measurement of IV: Time spent on devices will be measured via survey and data collection (website/instructional program recording).
    Measurement of DV: Student satisfaction/perception will be measured through a student survey.
    Demographics: Needed for the entire county, including grade, teacher effect, teacher experience, gender, student GPA, student socioeconomic status, and ethnicity. These same demographics will be needed for the sample group.

    • Katrich Williams May 28, 2018 at 8:34 pm

      Amy,
      I will also be using surveys. I think those are some of the best tools to use for quantitative data.

  37. Katrich Williams May 28, 2018 at 8:31 pm

    • Restate your QUANTITATIVE RESEARCH QUESTIONS – you may choose to make changes to them based on previous work. That is your decision.
    o Are female teenagers who are pregnant [IV] more likely to become high school dropouts [DV] than female teenagers who are not pregnant?
    o How many supplies and resources are available for teenaged girls who are pregnant [IV] to help prevent them from becoming high school dropouts [DV]?
    • For both questions, identify the population to which you want to generalize the results of your study.
    o Female teenagers who are pregnant and female teenagers who are not pregnant
    o Teenaged girls who are pregnant
    • Then identify the representative samples you would study in order to complete the study
    o Teenaged mothers with IEPs
    o Teenaged mothers in single parent homes
    o Teenaged mothers with behavioral problems
    o Race of the teenaged mothers
    o What companies are providing supplies and resources
    • Identify how you will measure/quantify the DVs and IVs.
    o DVs will be measured by results from surveys, questionnaires, and interviews
    • Brainstorm a list of demographic info that you would need to understand about your sample.
    o Race
    o Family Income
    o School Size

    • Jessica Ordonez May 28, 2018 at 11:03 pm

      Katrich,
      What an amazing question! Where will you find teenage mothers? In your district? State?
      The Nation?
      Another demographic to consider may be the age of the teen mother. The data might be different for a 15-year-old vs. a 17-year-old.

  38. Jordan Reed May 31, 2018 at 5:16 pm

    Best practices on teaching students with learning differences such as Autism, ADHD, TBI, etc.

    Quantitative:
    1: In what ways does the implementation of a positive behavior intervention system (Mystique Points System) [IV] cause a reduction in adverse student behavior [DV]?
    Population: Students with special needs
    Sample: Students at The Edison School
    Measurement of IV: Measurements taken via the Mystique Points System
    Measurement of DV: Teacher observations / survey, office referrals
    Demographics: Student behavior data, class size, academic content being covered, student age

    2: What effect do smaller class sizes (student:teacher ratio of 7:1) [IV] have on achievement levels of students with learning differences [DV]?
    Population: Students with special needs
    Sample: Students at The Edison School
    Measurement of IV: Student:Teacher Ratio
    Measurement of DV: IOWA test scores, Aimsweb benchmarking / progress monitoring data
    Demographics: Student SES, length of time enrolled at The Edison School, amount of time spent per student, school size

  39. Tina Shepherd June 1, 2018 at 4:15 pm

    Wiki 4
    Topic: Lesbian, gay, bisexual, transgender, and queer (LGBTQ) students in the academic setting
    Quantitative questions:
    1. Do LGBTQ students have higher safety risks (IV) in school (DV) than other students?
    Population: High school students
    Sample: A purposive sampling of high school students from online research and from Davidson County schools.
    Measurement: Safety risks will be measured by the number of discipline referrals/incidents reported in the schools.
    Demographics: Student data including gender, school locale, race, school size, teacher and student data

    2. Do LGBTQ students face more discrimination in the school setting (DV) in the 21st century (IV) than they did 15 years ago?
    Population: High school students
    Sample: A purposive sampling of high school students from online research and from Davidson County schools.
    Measurement: Discrimination will be measured by the number of cases reported to the school districts and by teacher and student surveys.
    Demographics: Student data including gender, school locale, race, school size, teacher and student data

  40. Debbie Booker June 1, 2018 at 4:17 pm

    Topic: The necessity of nontraditional schools for at risk students

    Quantitative questions:
    1) Do at-risk students perform better academically (IV) at nontraditional/alternative schools (DV) than traditional schools?
    Population: high school students
    Sample: A purposive sampling of high school seniors in Davidson County Tennessee.
    Measurements: TNReady test scores, report cards, growth rate
    Demographics: school size, location of school, race, gender

    2) What are the distinctive processes nontraditional schools use (IV) that are effective at improving the academic performance of at-risk students (DV)?
    Population: High school teachers
    Sample: A purposive sample of high school teachers in nontraditional/alternative and traditional schools
    Measurements: Survey teachers about their instructional strategies, graduation rate
    Demographics: school size, location of school, race, gender
