Analysis - Reflection

Artifacts

  1. RCAD Analysis Summary (Fall 2022, EDIT 6170E) and
  2. Data Analysis Report for a Technology Integration Project (Fall 2024, EDIT 7350E)

Analysis, for me, is where design begins. It is the space where curiosity meets structure, where I try to make sense of messy, sometimes conflicting information and translate it into a clear path forward, much like getting engrossed in a mathematics problem I am trying to solve. The two artifacts I selected for this theme, the RCAD Analysis Summary (Fall 2022, EDIT 6170E) and the Data Analysis Report for a Technology Integration Project (Fall 2024, EDIT 7350E), illustrate how my analytical practice has evolved from guided teamwork to independent, evidence-driven evaluation. They mark two points in my IDD journey: one where I was learning to navigate complex institutional systems within a team, and another where I was trusted to independently interpret and synthesize data into meaningful insights.

After stepping away from coursework following Fall 2017 for personal responsibilities, I returned to the program with renewed focus. Revisiting my RCAD Analysis Summary from Fall 2022 reminded me of how formative that project was to my development as an instructional designer. It was my first major experience working within a large design team, responding to a client request from the University of North Georgia to create a first-year student success course. Our task was to diagnose performance gaps, identify learner needs, and define measurable instructional goals. We used the ADDIE framework to organize our analysis and engaged in several client meetings to clarify expectations and contextual realities; I believe this was also my first experience applying ADDIE in practice.

My specific contributions included creating resource charts for hybrid and online synchronous delivery models and drafting the introduction to the final report. I learned quickly that analyzing a learning problem at an institutional level required attention to both the human and structural elements. We examined learner characteristics, motivation patterns, and the contextual constraints surrounding student support systems. To ensure rigor and alignment, our team mapped outcomes to Bloom’s taxonomy, building a logical progression from basic awareness of campus resources to higher-order self-management strategies that supported retention and engagement.

That project taught me how analysis is not just about describing data but about interpreting meaning in context. Through collaborative meetings and shared revisions, I learned to listen closely to feedback from clients and peers. Each member of our team brought a different lens, and the synthesis of those perspectives made our findings richer and more grounded. Working across time zones and schedules also tested my project management and communication skills. Those experiences reinforced the ibstpi expectations related to professional conduct, collaboration, and data-driven decision-making. In many ways, the RCAD project gave me a foundation for systems thinking: understanding how individual learner behaviors connect to institutional structures and policies.

Two years later, that foundation became essential when I approached my Data Analysis Report for a Technology Integration Project (Fall 2024). This time, I worked independently, analyzing data from a two-year mentorship program aimed at supporting K–12 teachers in adopting classroom technologies. Although the context differed, the evaluative mindset felt familiar: diagnose the current state, identify gaps, and recommend feasible improvements.

The project combined survey data, observation notes, and interview transcripts, giving me a rich mix of qualitative and quantitative evidence. My first step was to clean and organize the data so I could understand what story it might be telling. Using descriptive statistics, I examined patterns in participants’ confidence, frequency of technology use, and perceived institutional support. Then, through thematic coding of open-ended responses and interview data, I identified recurring challenges like limited collaboration time, uneven administrative follow-up, and lack of visible recognition for mentor contributions.

What I found most engaging about this process was how each data type illuminated a different layer of reality. The numbers pointed to general trends, while the narratives revealed why those trends existed. Through triangulation, I could see where the quantitative and qualitative data converged, helping to validate my interpretations. Based on this synthesis, I developed a set of recommendations emphasizing structured peer mentoring, regular reflection checkpoints, and leadership visibility.

This analytical process mirrored the kind of evidence-based reflection I now apply in my faculty role when examining student engagement and instructional data. I have always believed that teaching and design share a common thread: the need to observe, interpret, and adjust based on feedback. The Data Analysis Report reminded me that data is most powerful when paired with empathy and contextual awareness. Although I was initially nervous about managing multiple data sources on my own, I came to enjoy the problem-solving aspect. I also appreciated how Dr. Bagdy designed the project in phases, as it enabled us to take a more structured approach. I was not just asking what the data said, but what it meant for improving human performance. Come to think of it, this connects beautifully to the course I later took in Spring 2025 with Dr. Stefaniak on Human Performance Technology.

Both the RCAD Analysis Summary and the Data Analysis Report highlight my growth from guided collaboration to independent analysis. In the RCAD project, I learned to interpret data through teamwork, aligning diverse perspectives to create a coherent narrative for institutional improvement. In the technology integration project, I learned to apply that same discipline on my own by making analytical judgments, justifying decisions, and synthesizing evidence into actionable recommendations. Together, these experiences shaped the way I approach analysis: as both a structured process and a creative, interpretive act.

These projects also prepared me for more advanced work in EDIT 7150E, where I conducted a needs assessment for the Center for Teaching Excellence (CTE). The habits I developed, such as triangulating multiple data sources, working through ambiguity, and translating findings into realistic design strategies, directly informed how I later approached institutional surveys and stakeholder reports.

As a faculty member in higher education, analysis, for me, is no longer just a front-end task. It is an ongoing responsibility to stay responsive to data, context, and people. I now approach any new design challenge by asking: What evidence do I have, and what does it tell me about the learners and the system they operate in? That question guides every recommendation and change I make to my teaching.