
Students as Partners in Research – NET 250

Aidan Beck, Lecturer, Newcastle University Business School

What did you do?

Newcastle University Business School was asked to collaborate with North East Times and PwC on a research project called “NET 250”, initiated by Fiona Whitehurst (Associate Dean Engagement and Place) and Sarah Carnegie (Director of Employability). PwC is a multinational organisation offering professional services and one of the “Big Four” accounting firms, whilst North East Times is a local media organisation which showcases businesses in the region through its business publication. This was therefore a significant project to be involved in, working with two major stakeholders.

The “NET 250” project aimed to highlight the top 250 companies by revenue in the North East and celebrate the success of business in the region through a high-profile awards event hosted by North East Times on 14th May 2025 at the Baltic Centre for Contemporary Art. The output of the research project was also included in the May 2025 edition of the North East Times’ “N” magazine. The event was attended by over 200 people[1], including many business leaders and senior personnel from organisations around the region.

My role was to act as the project lead and supervise four students, who were recruited for paid work between October and December 2024 as partners in research for the project. Because of the agreed scope of the project and the criteria for a company to be eligible, the students were required to research several data points for a large number of companies in the region.

Students used publicly available information, financial databases and Companies House to gather the information and documented their work in a spreadsheet. This spreadsheet was then used to compile the final list of 250 companies which was refined and agreed by North East Times, PwC, Fiona Whitehurst (Associate Dean Engagement and Place) and myself.

The output of the students’ work was therefore crucial in the production of the final list of 250 companies which was revealed at the “NET 250” awards event and published in North East Times’ “N” magazine.

Who was involved?

Aidan Beck (Lecturer in Accounting & Finance), collaborating with those who are responsible for impact and engagement at Newcastle University Business School, working in a cross-school partnership.

Four students – two postgraduate and two undergraduate, a mixture of international and home students, each studying a different degree programme (MBA, MSc Economics and Finance, BSc Economics and Business Management and BA Business Management).

How did you do it?

The four students were recruited following an application process which first required the submission of a CV and a cover letter highlighting relevant experience for the project, for example the ability to work in a team, experience in using financial databases and analysing financial data. A shortlist of candidates was then compiled, and interviews were held in October 2024 with senior managers in the Business School.

An initial briefing meeting was held towards the end of October 2024, during which the students were briefed on the project timeline and their role, and the scope and approach were discussed and agreed with PwC, who also attended the meeting.

Following the initial briefing, the students then worked independently and managed their own time around their university commitments to undertake the research. They used financial databases and publicly available information to gather the data, and then used Companies House to ensure the key data required was factually accurate.

Although revenue was the key metric, there were several additional criteria to consider and additional data to collect to ensure the agreed scope was followed and accurate data documented. These included whether the company had significant decision-making functions and an obvious presence in the region, the date on which accounts were submitted to Companies House, and whether the company was a subsidiary or related company of another within the list.
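
To make the criteria concrete, the sketch below shows how selection logic of this kind could be applied to the collected data. It is illustrative only: the Company record, its field names and the compile_top_list helper are assumptions for this example, not the project’s actual spreadsheet or process, which relied on manual research, judgement and stakeholder verification.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Company:
    name: str
    revenue: float                # from the latest accounts filed at Companies House
    regional_presence: bool       # significant decision making and presence in the region
    parent: Optional[str] = None  # name of the parent company, if a subsidiary

def compile_top_list(companies: list[Company], size: int = 250) -> list[Company]:
    """Apply the eligibility criteria, then rank the survivors by revenue."""
    eligible = [c for c in companies if c.regional_presence]
    names = {c.name for c in eligible}
    # Exclude subsidiaries whose parent already appears among the candidates,
    # so that a corporate group is not counted twice in the list.
    deduped = [c for c in eligible if c.parent not in names]
    return sorted(deduped, key=lambda c: c.revenue, reverse=True)[:size]
```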

I met weekly with the students throughout the research project (alternating between online and in-person) to discuss the students’ progress, investigate any anomalies identified and answer any questions they had.

Following the completion of the students’ work, I performed a comprehensive review of the list, submitting it to PwC in January 2025 for the data to be verified. Subsequently, meetings were held in 2025 between North East Times, PwC, Fiona Whitehurst and myself to refine and agree a final list.

Did it work?

Overall, the project itself was a success, with the students describing it as a challenging and rewarding experience and one which has developed their critical thinking, teamwork, data research and financial analysis skills. Developing student skills as researchers is increasingly important as they progress through their studies. The “NET 250” project will therefore help improve student engagement with research as part of their studies.

In February 2025, two press releases were published[2][3] which launched the “NET 250” campaign, outlined its scope and mentioned the students and staff at Newcastle University Business School involved. The same people were then named and thanked at the “NET 250” awards event and in the editor’s welcome of the May 2025 edition of the North East Times’ “N” magazine. Not only does this highlight the appreciation and value of the efforts of the students and colleagues involved, but it is also positive for Newcastle University Business School to be a key partner in a project which has real meaning and significance for companies across the region.

Engaging in this type of project requires considerable effort from both the staff and the students involved, especially when the project is a new venture for everyone. As the project leader and supervisor of the students, I had ultimate responsibility for how the project was run, for dealing with any performance issues, and for the quality and accuracy of the research output. My role was also to support the students, coordinate the team, review the students’ work and liaise with the key stakeholders (PwC and North East Times). With the students all studying on different degrees, a logistical challenge was making sure everyone was free at the same time when organising meetings. Additionally, engaging closely and regularly with the key stakeholders and Fiona Whitehurst (Associate Dean Engagement and Place) was important, firstly to ensure the project scope was followed and secondly to refine and finalise the “NET 250” list.

This type of project was new to me as well, so the support and guidance of those more experienced in engagement activities at Newcastle University Business School was equally important to the success of the project.

Because this type of research project was also new to the students, they required support and guidance throughout from me as project leader. In addition, the project was extra-curricular, which meant the students had to balance its workload with their independent studies, assessment commitments and timetabled activities, whilst working to a challenging project timeframe.

In conclusion, through working with two important stakeholders (PwC and North East Times), the press releases, the “NET 250” awards event and the publication of the research output in the May 2025 edition of the North East Times’ “N” magazine, this project has been both significant and impactful within the North East region. It has been beneficial for the students’ development and the School’s reputation. The project has also helped develop my scholarship and engagement, in addition to inspiring me to adapt this type of research into the classroom in the future, through experiential learning (which is discussed in the next section).

Next steps?

As mentioned above, some of the benefits gained by the students were critical thinking, data research and financial analysis skills. As a result, I would like to integrate this learning into other areas.

For example, I plan to incorporate a variation of this research process into a non-specialist accounting and finance module, whereby students, playing the role of business managers, use financial databases and publicly available accounting information to evaluate the performance of businesses in comparison with their competitors and other key companies within the industry.

The objective of this is to adopt experiential learning by using real-world examples and financial databases to increase accounting literacy and highlight the importance of accounting information for those in business who aren’t accountants.

The Graduate Framework

This project helped the students develop the following skills from Newcastle University’s graduate framework:

Engaged – the students were all fully committed to the project, working hard throughout and were receptive to feedback.

Collaborative – the students had to work flexibly to a challenging timeframe, around their existing University studies, and ensure they communicated queries and findings clearly.

Curious – the students applied a questioning mindset throughout the project, always keen to learn more about accounting information and often asking “why.”

Digitally Capable – the students utilised publicly available information, financial databases and Companies House to gather the required information.

[1] https://bdaily.co.uk/articles/2025/05/14/the-net-250-the-north-easts-top-250-businesses

[2] https://bdaily.co.uk/articles/2025/02/12/newcastle-university-business-school-partnerships-on-launch-of-net-250

[3] https://www.ncl.ac.uk/business/news/school-partnerships-with-net/


What have I learned?: Student reflections of Work Integrated Learning

Sarah Carnegie, Senior Lecturer, Newcastle University Business School

The 40-credit capstone module, BUS3053 Management Consultancy Project, is available as an option to students on the Business Management, Marketing and Management, and Accountancy and Finance undergraduate programmes. The module can be characterised as an example of work integrated learning (WIL) as it involves “embedding learning activities and assessment that involves students in meaningful industry and / or community engagement” (Jackson and Bridgstock, 2021, p.726). In each academic year the students from these programmes are allocated to small teams, ideally of 8, to engage with an external client. Each client is assessed by the School as having a suitable business issue or problem for the students to tackle and therefore provides an authentic, live work experience (Dollinger and Brown, 2019).

The value of WIL is widely appreciated in existing research, with positive outcomes including supporting student capacity to take responsibility for their work (Caldicott et al., 2022), fostering work values and human capital (Ng et al., 2022), and developing a range of skills such as teamworking, communication and critical thinking (Jackson and Bridgstock, 2021). Additionally, the importance of developing professional readiness has been noted (Jackson and Bridgstock, 2021; Jackson and Tomlinson, 2022). Whilst definitions of such readiness will vary depending on career direction, it is acknowledged that students require certain common skills, so they can “emerge as professionals; navigate relationships with others; and build their sense of self” (Caldicott et al., 2022, p.388).

However, are modules such as BUS3053 truly offering opportunities for students to achieve these positive outcomes? In May 2024 the Institute of Student Employers (ISE) highlighted that the gap between employer expectations and graduate behaviours is widening. Just under half (49%) of employers reported that graduates were career ready at the point of hire (a decrease from 54% in 2023). Whilst there may be an extent to which employers lack understanding of the student experience, concerns about how well students understand what life beyond university will be like, and whether they have developed the necessary career skills to navigate it, have been voiced for some time (Bridgstock, 2009). What impact can such modules have in helping students apply their learning and development of skills in workplace environments?

This module includes a 2-hour ‘celebration and reflection’ event, scheduled for the week following the submission of the outcomes of the group work: the Client Report (40%) and Client Presentation (10%). The event is designed to encourage active reflection and support the remaining 50% of the module, an individual reflective assignment, which asks the students to discuss the learning gained from the module. As Ryan (2013, p.144) states, ‘learners are not often taught how to reflect’, so the intention of the event is to provide a relaxed and informal opportunity to think and talk about what they have experienced working on the module.

During the ‘celebration and reflection’ event the students sit with their team colleagues and are provided with roughly divided sheets of flip chart paper and a range of coloured pens, markers and highlighters. They are asked to draw or comment, however they wish to, in response to a series of questions about what they have learnt, with at least 10–15 minutes allowed for chatting and sharing of thoughts. As Lengelle et al. (2016, p.106) comment, good reflection should involve “stimulating a playful, creative process that fosters a sense of fun and competence”.

At the event held in March 2025, students were asked to reflect on various elements of their experience on the module, including two specific questions:

  • ‘What skills have I learnt or developed further (during the project)?’
  • ‘What have I learnt about being a professional (during the project)?’

Students self-identified, in pictures or words, particular skills and learning points.

For the question ‘what skills have I learnt or developed further’, the responses from the 30 students who attended were as follows:

Figure 1 – Skills mentioned more than 2 times

Responses from the same 30 students to the question ‘what have I learnt about being a professional’ were more wide-ranging, and have been categorised as follows.

Responses focusing on specific practical skills:

  • Email – 16 mentions related to setting up email signatures, email etiquette, drafting professional emails and learning email ‘dialect’
  • Planning – 6 mentions related to project planning, drafting plans and schedules, and setting up invites for meetings using Teams

Responses focusing on workplace behaviours:

  • Attending meetings – 10 mentions related to being punctual, preparing for and attending meetings

  • Personal attributes – 9 mentions of personal attributes such as taking initiative, resilience, respect for others, leadership and perseverance

  • Managing work – 8 mentions related to delegating, taking responsibility, problem solving and managing deadlines

  • Developing social capital – 4 mentions of networking

Figure 2 – Student representations of skills and attributes

The responses provide insights into how the students understand ‘skills’ and ‘being professional’, with both practical functional skills and personal attributes and behaviours described for each. It is interesting that the learning reported ranges from ‘patience’, one of the main behaviours mentioned, to what could be considered an everyday practicality, how to ‘email’, also highlighted by a significant number of the students.

The reflections illustrate that the module is helping students to become career ready in a holistic way. They can recognise specific skills they have learnt and have gained an awareness of what will be required of them in a professional work environment. The module provides a relatively ‘safe’ space where they have to quickly learn relevant skills and appropriate professional behaviours to be able to engage with their clients. However, we can also consider how some of the skills highlighted could be integrated into their learning and assessment earlier, so that the more everyday practices are already embedded and students can focus, with more confidence, on developing their deeper workplace interactions with clients.

Reference List

Bridgstock, R. (2009) The graduate attributes we’ve overlooked: enhancing graduate employability through career management skills, Higher Education Research & Development, 28:1, 31-44,

Caldicott, J., Wilson, E., Donnelly, J. F., & Edelheim, J. R. (2022). Beyond employability: Work-integrated learning and self-authorship development. International Journal of Work-Integrated Learning, 23(3), 375.

Cunliffe, A.L. (2016) Republication of ‘On Becoming a Critically Reflexive Practitioner’, in Journal of Management Education, Vol 40(6), pp 747 – 768.

Dollinger, M., and Brown, J. (2019). An institutional framework to guide the comparison of work-integrated learning types. Journal of Teaching and Learning for Graduate Employability, 10(1), 88–100.

Institute of Student Employers (2024) Is career readiness in decline? https://ise.org.uk/knowledge/insights/195/is_career_readiness_in_decline Accessed 30/05/25

Jackson, D., and Bridgstock, R. (2021). What actually works to enhance graduate employability? The relative value of curricular, co-curricular, and extra-curricular learning and paid work. Higher Education, 81(4), 723–739.

Jackson, D., and Tomlinson, M. (2022). The relative importance of work experience, extra-curricular and university-based activities on student employability. Higher Education Research and Development, 41(4), 1119–1135.

Lengelle, R., Luken, T. & Meijers, F. (2016) ‘Is self-reflection dangerous? Preventing rumination in career learning’, Australian journal of career development, 25(3), pp. 99–109.

Ng, P.M.L., Wut, T. M., and Chan, J. K. Y. (2022). Enhancing perceived employability through work-integrated learning. Education & Training (London), 64(4), 559–576.

Ryan, M. (2013) The pedagogical balancing act: teaching reflection in higher education, in Teaching in Higher Education, Vol. 18, No. 2, 144-155.


Rethinking the Newcastle University Business School (NUBS) Approach to Professional Body Relationships: A Critical Examination of Student Value

Dr Kirk Dodds, Senior Lecturer, Newcastle University Business School

Phil Morey, Senior Lecturer, Newcastle University Business School

Dr Kirsty Munroe, Lecturer in Accounting, Newcastle University Business School

Overview, Context and Literature

Universities often collaborate with professional bodies to secure accreditations for a range of business degrees. While these accreditations and their associated “badges” are valuable for marketing purposes, the potential for such partnerships to contribute long-term value to the curriculum and student experience remains an underexplored area (Ulseth & Johnson, 2017; Izuegbu, 2007).

For instance, our relationship with the Data & Marketing Association (DMA) exemplifies how professional body partnerships can transcend basic accreditation (Pepple et al., 2025). Through initiatives such as professional boot camps, inclusive standards addressing hidden disabilities, exam exemptions, and industry engagement opportunities like judging panels, the DMA collaboration has enriched the student experience in tangible ways.

The significance of this partnership lies in its ability to go beyond the mere acquisition of a badge (Dražeta, 2023). It raises a critical question: how can relationships with professional bodies meaningfully enhance the student experience? This paper explores the transformative impact of such collaborations, offering a critical examination of MSc Digital Marketing students’ perspectives on their engagement with the DMA.

Methods and Thematic Data Analysis

A focus group was conducted in April 2025 with 12 of our MSc Digital Marketing students. The session followed a semi-structured approach, with themes drawn from existing literature to guide the questions (Ruslin et al., 2022). The focus group lasted 1.5 hours, and the data was analysed using template analysis (King, 1998) to identify key themes from the student perspectives on relationships with professional bodies.

Through thematic analysis, six major themes emerged:

  1. The role of university ranking in student decision-making.
  2. The alignment of professional qualifications with academic degree content.
  3. The value students place on practical learning opportunities provided by professional bodies.
  4. Student desire for more practice-based digital marketing content.
  5. Opportunities for increased face-to-face engagement between students and the professional body.
  6. The importance of assessments being aligned with industry-relevant practices.

Each theme drawn from the focus group revealed key insights and conclusions:

1. The Role of University Ranking in Student Decision-Making

Contrary to some findings in the literature (Ulseth & Johnson, 2017; Izuegbu, 2007), students indicated that professional body accreditation was not a key factor in their decision-making during university selection. Instead, significant emphasis was placed on university rankings, particularly the league tables and membership of the Russell Group. Most students admitted they were unaware of the professional body at the application stage and only became aware of it after commencing their studies.

2. The Alignment of Professional Qualifications with Academic Degree Content

A small number of students expressed interest in undertaking professional qualifications, noting the transferable benefits after graduation. However, they also highlighted a disconnect between the level of the qualifications (typically Level 4) and their postgraduate degree (Level 7). Half of the focus group stressed the importance of qualifications being aligned with the academic level of their degree, while the other half felt the level itself was less relevant, focusing instead on industry applicability.

3. The Value Students Place on Practical Learning Opportunities Provided by Professional Bodies

Hands-on learning and real-world problem-solving were highly valued by students. They expressed strong interest in applying their knowledge to authentic business contexts, such as running live social media campaigns or analysing and using real data to make marketing decisions.

4. Desire for More Practice-Based Digital Marketing Content

Although university rankings influenced their decision to enrol, students voiced a strong desire for more practical content throughout their studies. While some modules were praised for incorporating practical elements effectively, students felt this was inconsistent across the programme. They suggested clearer signposting of how modules align with the activities and standards of the DMA. Currently, they felt this alignment was disjointed and not clearly integrated into the curriculum.

5. Opportunities for Increased Face-to-Face Engagement with the Professional Body

Students expressed a strong preference for meeting representatives from the professional body in person. Due to resource and scheduling constraints, current engagement primarily includes pre-recorded videos and university tutor updates. While students appreciated these efforts, they strongly preferred live interaction, either through in-person sessions or live webinars, believing this would significantly enhance their experience and connection with the professional body.

6. The Importance of Assessments Being Aligned with Industry-Relevant Practices

Students emphasised that their primary focus during their studies is on assessments. While they acknowledged the existence of extracurricular activities, their priority remains academic achievement. This highlights the need to design assessments that incorporate practical, real-world elements in ways that also align with academic objectives. Ensuring industry relevance within assessments is key to maximising student engagement and employability outcomes.

Conclusion

Taking this research forward, the findings have been disseminated to the professional body and will also be shared with other university tutors via the DMA. To improve our MSc Digital Marketing programme, the following steps are recommended:

Internal Programme Improvements

  • Create a student-friendly roadmap for the MSc Digital Marketing programme that clearly signposts where and how professional skills are integrated across the course.
  • Host an annual visit from the DMA at the start of each academic year, ideally face-to-face, to aid student engagement and awareness.

Recommendations for the Professional Body

  • Introduce a student-led session, where alumni share their experiences of working with the DMA (e.g., via Creative Data Studio, summer schools, or student competitions). This could be delivered cost-effectively through an online format and opened to all universities.
  • Develop an annual hands-on session with Creative Data Studio in the North East, facilitated by the DMA. This would offer students across North East universities the opportunity to work directly with data and a real-life organisation.
  • Enhance the DMA Student Competition (a national-level competition) by including additional data in the case study brief, for example insights from previous campaigns or data on tools such as social media platforms.

References

Dražeta, L. (2023). Education reimagined: EY badges and degrees. FINIZ 2023 – Sustainable Development as a Measure of Modern Business Success, 109-113.

Izuegbu, V. E. (2007). Students as designers of their own life curricula: reconstruction of experience in education through thoughtful deliberative action. Journal of Thought, 42(3-4), 39-53.

King, N. (1998). Template analysis.

Pepple, D. G., Akaighe, G. O., Sambo, A., George-Aremu, O., Bosah, G., & Trollman, H. (2025). Using guest lectures to enhance student employability: pedagogical considerations. Cogent Education, 12(1), 2452076.

Ruslin, R., Mashuri, S., Rasak, M. S. A., Alhabsyi, F., & Syam, H. (2022). Semi-structured interview: A methodological reflection on the development of a qualitative research instrument in educational studies. IOSR Journal of Research & Method in Education (IOSR-JRME), 12(1), 22-29.

Ulseth, R. R., & Johnson, B. (2017). Self-directed learning development in PBL engineering students. International Journal of Engineering Education, 33(3), 1018–1030.


The Unfairness of AI-Flagged Academic Misconduct Investigations in UK Universities

Dr David Grundy, Director of Digital Education, Newcastle University Business School


Abstract

The increasing reliance on AI text detectors in UK higher education to flag potential academic misconduct raises profound fairness concerns. When universities initiate investigations solely based on an AI “red flag,” they risk contaminating the entire process, violating principles of due process and natural justice. Using the Office of the Independent Adjudicator’s (OIA) Good Practice Framework as a benchmark, this article examines how an initial AI suspicion can taint investigations (“fruit of the poisoned tree”), the high rate of false positives that undermines “just cause,” cognitive biases introduced when investigators know about AI flags, and the legal and procedural standards demanding evidence-based, impartial processes. Policy recommendations are offered to ensure AI detection tools serve only as preliminary aids, not as sole grounds for disciplinary action.

Keywords: AI Detection, Academic Misconduct Investigations, Unfair Processes, False Positives, Confirmation Bias, Anchoring Bias, Good Practice Framework

The “Fruit of the Poisoned Tree”

Originating in U.S. criminal law, the “fruit of the poisoned tree” doctrine holds that evidence derived from tainted sources is inadmissible. In UK academia, an AI detection flag functions as a “poisoned tree” when it is inherently unreliable. Initiating a misconduct investigation based solely on such a flag means the process is contaminated at its root, predisposing investigators to find guilt irrespective of subsequent evidence (Mita, 2022). Although UK courts do not formally adopt this doctrine, its underlying principle is embedded in the OIA’s mandate that disciplinary processes be fair and evidence-based (OIA, 2024). An AI-generated suspicion is akin to an unsubstantiated tip-off rather than concrete proof; using it as the trigger for formal inquiries unfairly prejudices the student, as any exculpatory evidence may be discounted through a lens of presumed guilt. To uphold justice, universities must treat AI flags as prompts for further verification, not as grounds for launching full-scale investigations.

False Positives and Lack of “Just Cause” for Investigation

A cornerstone of fairness in academic discipline is “just cause”—reasonable grounds to suspect misconduct. Generative AI detectors, however, exhibit significant error rates (Dalalah & Dalalah, 2023; Giray, 2024). Turnitin’s own data reveal a 4% sentence-level false-positive rate (Chechitelli, 2023), implying that a sizeable fraction of human-written work could be mislabelled. Walters (2023) estimates that even a 10% false-positive rate could wrongfully implicate dozens of students per cohort, accumulating to hundreds over multiple years. Empirical assessments of 14 detectors report accuracies ranging from merely 33% to 81% (Weber-Wulff et al., 2023), and Temple University’s evaluation found Turnitin’s detector only 77% accurate at spotting AI text, with a 7% mis-flag rate for genuine writing (Asselta Law, 2025). The University of Pennsylvania concurs that many detectors have “dangerously high” false-positive defaults (Wood, 2024). Moreover, bias against non-native English writers exacerbates injustices: Liang et al. (2023) demonstrate that GPT detectors disproportionately target second-language students. An AI flag, therefore, falls far short of providing reasonable suspicion; it resembles unreliable hearsay rather than the “hard evidence” required to justify formal proceedings (OIA, 2024). Without corroborating indicators—such as verbatim matches to external sources or inability to reproduce the work—launching investigations on AI scores alone constitutes an unjustified witch-hunt, subjecting innocent students to undue stress and reputational harm (Gorichanaz, 2023).
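
To see why a flag alone falls short of “just cause”, consider a back-of-the-envelope calculation using the figures cited above. This is an illustrative sketch only: it treats the 4% sentence-level false-positive figure as if it applied per submission, borrows the 77% detection accuracy from the Temple University evaluation as sensitivity, and assumes, purely for argument’s sake, that 10% of submissions genuinely involve undisclosed AI use.

```python
# Back-of-the-envelope estimate of how many flags land on innocent work.
# The prevalence value is an assumption for illustration, not a measured figure.

def flag_breakdown(cohort, false_positive_rate, prevalence, sensitivity):
    honest = cohort * (1 - prevalence)          # students who did not use AI
    cheats = cohort * prevalence                # students who did
    false_flags = honest * false_positive_rate  # innocent work flagged anyway
    true_flags = cheats * sensitivity           # misconduct correctly flagged
    share_innocent = false_flags / (false_flags + true_flags)
    return false_flags, share_innocent

# 1,000 submissions, 4% false-positive rate, 77% sensitivity, 10% assumed prevalence.
false_flags, share_innocent = flag_breakdown(1000, 0.04, 0.10, 0.77)
print(f"Innocent submissions flagged: {false_flags:.0f}")            # 36
print(f"Share of all flags that are innocent: {share_innocent:.0%}")  # ~32%
```

Under these assumptions, roughly a third of all flagged submissions would be innocent, and the proportion climbs further as the true prevalence of misconduct falls, which is precisely when a flag is most likely to be over-trusted.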

Cognitive Bias and Presumption of Guilt in AI-Flagged Cases

Once a case is initiated on the basis of an AI flag, investigators become vulnerable to cognitive biases. Confirmation bias leads them to seek out evidence that confirms the initial AI suspicion while overlooking exculpatory signs (Rassin, 2022; Wallace, 2015). Anchoring bias further cements this effect: a reported “85% AI-generated” score becomes an immovable reference point, skewing all subsequent evaluations (Ly et al., 2023). Forensic research shows that contextual suggestions of guilt “cannot be unseen,” distorting experts’ judgments even after the context is removed (Kunkler & Roy, 2023). In academic settings, knowing an essay was flagged primes investigators to interpret well-crafted sections suspiciously (“too good to be the student’s own”), rather than neutrally assessing content and process (OIA, 2024). Instructors may unconsciously scrutinize minor stylistic deviations or benign editing-tool usage as signs of cheating, while ignoring personal reflections or draft submissions that demonstrate authorship. This is particularly harmful for international students, whose strong writing skills can be misattributed to AI (Mathewson, 2023). By injecting a presumption of guilt and tainting the investigator’s mindset, AI flags undermine the “innocent until proven guilty” ethos and violate natural justice standards that forbid “any reasonable perception of bias or pre-determination” (OIA, 2024).

Due Process, Fairness, and Evidence

UK universities operate under internal regulations and public law requiring fairness, impartiality, and evidence-based decisions (OIA, 2024). Students are entitled to know precise allegations and the evidence against them, and to respond fully. AI detectors, however, provide opaque probability scores with no transparent rationale, denying students a meaningful opportunity to challenge the “evidence” (Chechitelli, 2023). The typical “balance of probabilities” standard demands substantive proof that misconduct more likely than not occurred—an unsubstantiated AI score cannot meet this threshold. In appeals to the OIA, universities would struggle to justify decisions hinging on black-box algorithms rather than verifiable facts. Furthermore, evidence obtained through improper means—such as uploading student work to unauthorized free detection tools—may violate GDPR and intellectual property rights, rendering it inadmissible. Courts will intervene if procedural fairness or contract terms are breached; a case built chiefly on AI probabilities risks being overturned as procedurally unfair (OIA, 2024). To satisfy due process, any allegation of AI-assisted plagiarism should be substantiated with concrete examples—verbatim matches, inability to reproduce work, or clear stylistic mismatches informed by multiple writing samples—and accompanied by full disclosure of tool limitations to the student.

Policy Recommendations

Thresholds for “Just Cause”

AI detection results must not be the sole trigger for investigations. Universities should require corroborating evidence—verbatim text matches, stark deviations from the student’s known writing style, or inability to demonstrate authorship—before proceeding. Policies should explicitly state that AI flags serve only as preliminary guidance (Webb, 2023; Wargo & Anderson, 2024).

Human Oversight and Professional Scepticism

Flagged submissions should prompt human review by trained subject-matter experts or integrity officers. Reviewers must consider benign explanations (talent, grammar tools, personal reflections) and treat AI outputs as invitations for inquiry, not as proof (Newton & Jones, 2025).

Process Design to Mitigate Bias

Implement partial blinding: require markers to document independent concerns before viewing AI reports, or assign AI review to separate officers. Deliver training on cognitive biases in misconduct investigations, using case studies to highlight the risks of anchoring and confirmation bias (Born, 2024).
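
As a thought experiment, this partial-blinding rule can be expressed as a simple access constraint: the AI report is withheld until the marker’s independent concerns are on record. The sketch below is hypothetical; the FlaggedCase record and its methods are inventions for illustration, not the API of any real case-management system.

```python
from dataclasses import dataclass, field

@dataclass
class FlaggedCase:
    """Hypothetical case record enforcing marker review before AI-report access."""
    submission_id: str
    ai_report: str                 # detector output, hidden at first
    marker_concerns: list[str] = field(default_factory=list)
    concerns_locked: bool = False  # set once independent review is complete

    def record_concern(self, concern: str) -> None:
        # Markers document their own observations before seeing the AI report.
        if self.concerns_locked:
            raise RuntimeError("Independent review is already closed.")
        self.marker_concerns.append(concern)

    def close_independent_review(self) -> None:
        self.concerns_locked = True

    def view_ai_report(self) -> str:
        # The AI score is only released after independent concerns are on file,
        # so it cannot anchor the marker's initial judgement.
        if not self.concerns_locked:
            raise PermissionError("Record and close independent concerns first.")
        return self.ai_report
```

Even a lightweight convention to this effect, enforced procedurally rather than in software, would blunt the anchoring and confirmation biases described earlier.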

Transparent, Fair Regulations

Update academic integrity policies to affirm that no student may be penalized based solely on an AI detector. Incorporate explicit language: “Automated AI detection scores are preliminary aids only; findings must rest on verifiable evidence and academic judgment.” Disclose tool limitations and provide students access to the AI report and its accuracy parameters.

Assessment Design and Pedagogy

Shift away from punitive, detector-centric approaches toward authentic, iterative assessments (vivas, personalized tasks, draft-based assignments) that inherently discourage misconduct. Emphasize trust-based evaluation and support systems, reducing the institutional reliance on unreliable detection software (OIA, 2024).

Statement: Acknowledgement of Assistive Tool Usage

This document is the author’s own original work; however, Microsoft Word and Grammarly drafting and editing tools were used in its creation, as the author is dyslexic and the work would be unreadable without their use to refine the text. Both tools incorporate elements of machine learning and AI. The author takes full responsibility for the content of the published article.

References

Asselta Law. (2025, February 16). The Hysteria of Professors Accusing Students of Using AI. https://www.asseltalaw.com/blog/2025/02/the-hysteria-of-professors-accusing-students-of-using-ai/

Baker, R. S., & Hawn, A. (2022). Algorithmic Bias in Education. International Journal of Artificial Intelligence in Education, 32(4), 1052–1092. https://doi.org/10.1007/s40593-021-00285-9

Born, R. T. (2024). Stop Fooling Yourself! (Diagnosing and Treating Confirmation Bias). eNeuro, 11(10). https://doi.org/10.1523/ENEURO.0415-24.2024

Chechitelli, A. (2023, June 14). Understanding the false positive rate for sentences of our AI writing detection capability. Turnitin Blog. https://www.turnitin.com/blog/understanding-the-false-positive-rate-for-sentences-of-our-ai-writing-detection-capability

Cambridge University (2023, October). Investigating academic misconduct and mark checks. https://www.studentcomplaints.admin.cam.ac.uk/staff-support/investigating-academic-misconduct-and-mark-checks

Dalalah, D., & Dalalah, O. M. A. (2023). The false positives and false negatives of generative AI detection tools in education and academic research: The case of ChatGPT. The International Journal of Management Education, 21(2), 100822. https://doi.org/10.1016/j.ijme.2023.100822

Eaton, S. E. (2022, February 22). Check your bias at the door. University Affairs. https://universityaffairs.ca/features/check-your-bias-at-the-door/

Foltynek, T., Bjelobaba, S., Glendinning, I., Khan, Z. R., Santos, R., Pavletic, P., & Kravjar, J. (2023). ENAI Recommendations on the ethical use of Artificial Intelligence in Education. International Journal for Educational Integrity, 19(1), 12. https://doi.org/10.1007/s40979-023-00133-4

Giray, L. (2024). The Problem with False Positives: AI Detection Unfairly Accuses Scholars of AI Plagiarism. The Serials Librarian, 85(5–6), 181–189. https://doi.org/10.1080/0361526X.2024.2433256

Office of the Independent Adjudicator for Higher Education (OIA). (2024, June 3). Good Practice Framework. https://www.oiahe.org.uk/resources-and-publications/good-practice-framework/

Gorichanaz, T. (2023). Accused: How students respond to allegations of using ChatGPT on assessments. Learning: Research and Practice, 9(2), 183–196. https://doi.org/10.1080/23735082.2023.2254787

Kunkler, K. S., & Roy, T. (2023). Reducing the impact of cognitive bias in decision making: Practical actions for forensic science practitioners. Forensic Science International: Synergy, 7, 100341. https://doi.org/10.1016/j.fsisyn.2023.100341

Liang, W., Yuksekgonul, M., Mao, Y., Wu, E., & Zou, J. (2023). GPT detectors are biased against non-native English writers. Patterns, 4(7). https://doi.org/10.1016/j.patter.2023.100779

Ly, D. P., Shekelle, P. G., & Song, Z. (2023). Evidence for Anchoring Bias During Physician Decision-Making. JAMA Internal Medicine, 183(8), 818–823. https://doi.org/10.1001/jamainternmed.2023.2366

Mathewson, T. G. (2023, August 14). AI Detection Tools Falsely Accuse International Students of Cheating – The Markup. https://themarkup.org/machine-learning/2023/08/14/ai-detection-tools-falsely-accuse-international-students-of-cheating

Mita, S. (2022). AI Proctoring: Academic Integrity vs. Student Rights. Hastings Law Journal, 74(5).

Newton, P. M., & Jones, S. (2025). Education and Training Assessment and Artificial Intelligence. A Pragmatic Guide for Educators. British Journal of Biomedical Science, 81, 14049. https://doi.org/10.3389/bjbs.2024.14049

Rassin, E. (2022). ‘Anyone who commits such a cruel crime, must be criminally irresponsible’: Context effects in forensic psychological assessment. Psychiatry, Psychology and Law, 29(4), 506–515. https://doi.org/10.1080/13218719.2021.1938272

Sadasivan, V. S., Kumar, A., Balasubramanian, S., Wang, W., & Feizi, S. (2025). Can AI-Generated Text be Reliably Detected? (No. arXiv:2303.11156). arXiv. https://doi.org/10.48550/arXiv.2303.11156

Wallace, W. A. (2015). The Effect of Confirmation Bias in Criminal Investigative Decision Making [Ph.D., Walden University]. In ProQuest Dissertations and Theses. https://www.proquest.com/docview/1668379477/abstract/576B938495004949PQ/1

Walters, W. H. (2023). The Effectiveness of Software Designed to Detect AI-Generated Writing: A Comparison of 16 AI Text Detectors. Open Information Science, 7(1). https://doi.org/10.1515/opis-2022-0158

Wargo, K., & Anderson, B. (2024, December). Striking a Balance: Navigating the Ethical Dilemmas of AI in Higher Education. EDUCAUSE Review. Retrieved 28 February 2025, from https://er.educause.edu/articles/2024/12/striking-a-balance-navigating-the-ethical-dilemmas-of-ai-in-higher-education

Webb, M. (2023, September 18). AI Detection—Latest Recommendations. Jisc National Centre for AI. https://nationalcentreforai.jiscinvolve.org/wp/2023/09/18/ai-detection-latest-recommendations/

Weber-Wulff, D., Anohina-Naumeca, A., Bjelobaba, S., Foltýnek, T., Guerrero-Dib, J., Popoola, O., Šigut, P., & Waddington, L. (2023). Testing of detection tools for AI-generated text. International Journal for Educational Integrity, 19(1), 26. https://doi.org/10.1007/s40979-023-00146-z

Wood, C. (2024, September 10). AI detectors are easily fooled, researchers find. EdScoop. https://edscoop.com/ai-detectors-are-easily-fooled-researchers-find/


Embedding AI in Entrepreneurship Education: Practice Insights from a Lean Innovation Classroom – Co-Creation for Engagement and Digital Literacy

Dr Elaine Tan, Senior Lecturer, Newcastle University Business School

This article presents a practical approach to embedding artificial intelligence (AI) tools within entrepreneurship education, using a lean innovation module as a structured learning environment. Designed for final-year undergraduates, the module introduces students to generative AI as a shared resource and tool for use in the entrepreneurial process, supporting core tasks such as ideation, market research, customer insight development, and business model design. Co-creation was adopted within this module to undertake the shared endeavour of exploring AI’s potential, an approach advocated by Selwyn (2022) to support navigation of this emerging phenomenon.

Co-Design to navigate ‘messiness’ of experimentation

The decision to employ a co-creation approach stemmed from recognising students’ established use of AI and seeking to harness their existing knowledge and expertise to foster engagement and collaboration. Initially, a frank discussion with students regarding their AI practices shaped the module’s design. We agreed that, mirroring wider student behaviours (as highlighted in a recent Higher Education Policy Institute report by Freeman, 2025), students would likely utilise AI tools for assignments. Rather than restricting this, we embraced the opportunity to guide them in leveraging these tools effectively as an integrated part of their learning. This approach would encourage students to use AI to support various tasks in appropriate ways, while simultaneously promoting transparency and collaborative sharing of outputs with peers and the module leader for critical evaluation. We (the students and staff) also recognised that, despite its increasing prevalence, AI remains a relatively new tool: claims about its capabilities are potentially overstated (Selwyn, 2022), there were limitations on what data could be used with the platforms (e.g. no personal data should be entered), and all outputs should be examined critically for bias and error. Exploring its practical application and realising its potential would therefore likely be experimental, iterative and messy (Luckin & Holmes, 2016). In the spirit of co-creation, the expertise of students themselves as existing users of the tools was also acknowledged (Cook-Sather, Healey, and Matthews, 2021), and their input and feedback on the running of the module as a whole-class co-creation (Bovill, 2020) was both valued and appreciated.

What did students do?

Throughout sessions, students engaged in a variety of hands-on activities, utilising AI as a creative and critical tool. For example, they employed AI chat agents to brainstorm innovative ideas and developed detailed customer personas through AI-driven interviews. The students also collaboratively built value propositions and comprehensive business model canvases using AI, which they then critically inspected and discussed. While students had the freedom to explore how AI tools could be used in the innovation process, these activities were carefully scaffolded with facilitative classroom prompts to ensure that outputs were evaluated, challenged and iteratively improved.

The module’s core approach centred on empowering students to actively explore and experiment with AI, transforming them from passive learners into partners in the innovation process. One interesting outcome was that, because ‘wrong’ answers were often attributed to the AI rather than to a student’s perceived misjudgement, much of the anxiety associated with presenting ideas was removed (Cooper, Downing, and Brownell, 2018). Students were more inclined to contribute to class discussions because ‘their’ ideas were not being critiqued but presented as a neutral starting point for shared exploration. They were also able to use AI to create a starting point, avoiding ‘blank page syndrome’ and speeding up the initial stages of exploration. Conversations focused on what could be done to improve the outputs, what details were missing, and the quality of what had been produced, and the process was highly experimental. Students were then tasked with improving the outputs based on their conversations and providing more detailed prompts to support the development. The process fostered a more confident and collaborative environment, where students were comfortable iterating and building upon each other’s insights. The goal was always to test and refine concepts rather than to formally critique an individual’s thinking.

When exploring minimum viable products (MVPs), the co-creation approach of the module paid dividends as students sought to further contribute to and shape the exercises, generating tangible products for experimentation. The independence given to students to explore and contribute had allowed them to develop their own awareness of tools, and at this stage in the module they had begun to explore a wider range. This exploration extended beyond initial tools like ChatGPT, driven by a desire for more reliable outputs and a broader range of capabilities. Students uncovered and shared valuable resources, fostering a dynamic, collaborative learning environment. From the wealth of available tools, they identified those which they could use to quickly and easily build suitable MVPs for exploration. These took the form of mock landing pages, wireframe apps and explainer videos, all quickly and easily created with AI. Students shared these resources with the group as they were discovered, commenting on how they had found them and how they rated them, which created an extremely interactive and engaging session for both students and staff.

The eventual assessment of the module was a collation of the outputs of the exercises alongside a reflective commentary demonstrating the development of their entrepreneurial idea. The final section of the assessment was a reflection on the process itself: what they had learned about both innovation and themselves as potential entrepreneurs.

Concluding comments

Delivered through an experiential and co-creative framework, the module positioned AI as a shared tool to help students iterate on their ideas, interrogate assumptions, and simulate early-stage engagement with customers. No assertion of ownership of the tools by either staff or students was made. This contributed to the open and exploratory nature of the module and encouraged students to interact and contribute.

Key outcomes from this approach include improved student engagement, deeper reflection, and accelerated student progress through early-stage innovation activities. Students reported greater confidence in articulating business ideas, supported by their ability to use AI to generate options, test scenarios, and reframe problems. They also exhibited greater caution about uncritically accepting the outputs generated, indicating increased AI literacy. Students reported becoming increasingly aware of bias and of the tendency for tools to simply agree with any idea, no matter how incorrect or implausible, rather than offer challenge. From a module leader’s perspective, the facilitated integration of the tools in classrooms was highly effective in promoting engagement and interaction. It created a less hierarchical environment, with students more inclined to contribute and actively participate.

A critical concern emerging about student use of AI is that continued use of the tools detracts from creativity (Runco, 2023) and from students’ ability to think critically (Kim et al., 2024). Reflecting on the activities of this module, these concerns appear fully justified. At the beginning, students were indeed inclined to take the first output of a tool without question. Students saw the platforms more as a “service” for generating answers, similar to Google or a search engine, rather than as a tool to support them in developing their own thinking. Careful facilitation of the use of the tools, with reference to the module content, and the creation of a culture of candour about their adoption to support the development of literacy (Tan, 2013), was an effective way of prompting students to continue to engage with learning independently while still reaping the benefits of the tools’ affordances.

What is becoming evident, and uncomfortable, is that the conversation regarding the use of AI tools has now shifted the dial on assessment. The use of AI tools by students can now be taken as a near-universal certainty and has created what Song (2024) describes as a crisis. In recent assessment sessions, staff have witnessed a step change in how students are completing their work, with the use of AI now commonplace and unprecedented numbers of academic misconduct cases reported (Goodier, 2025). Educators are now challenged with addressing these conditions, with many reverting to traditional exam conditions to avoid the uncertainty of academic misconduct. This practice-led model offers some insights into one way of integrating AI without undermining academic integrity or reducing learning depth. The module context and content of Lean Innovation, a process of experimentation, learning by failure, and exploration, lent themselves well to this approach. However, the possibility of embedding AI within a pedagogic framework that values iteration, reflection, and co-creation is not limited to this topic. Educators can equip students with future-facing skills while maintaining focus on mindset development, creativity, and critical thinking.

Bovill, C., 2020. Co-creation in learning and teaching: the case for a whole-class approach in higher education. Higher education, 79(6), pp.1023-1037.

Cook-Sather, A., Healey, M. and Matthews, K.E., 2021. Recognizing students’ expertise and insights in expanding forms of academic writing and publishing about learning and teaching. International Journal for Students as Partners, 5(1), pp.1-7.

Cooper, K.M., Downing, V.R. and Brownell, S.E., 2018. The influence of active learning practices on student anxiety in large-enrollment college science classrooms. International Journal of STEM Education, 5, pp.1-18.

Freeman, J., 2025. Student Generative AI Survey 2025. Higher Education Policy Institute: London, UK.

Goodier, M. 2025. Revealed: Thousands of UK university students caught cheating using AI, The Guardian. Available at https://www.theguardian.com/education/2025/jun/15/thousands-of-uk-university-students-caught-cheating-using-ai-artificial-intelligence-survey (Accessed: 20 June 2025).

Luckin, R. and Holmes, W. 2016. Intelligence Unleashed: An argument for AI in Education. UCL Knowledge Lab: London, UK.

Kim, J., Kelly, S., Colón, A.X., Spence, P.R. and Lin, X., 2024. Toward thoughtful integration of AI in education: mitigating uncritical positivity and dependence on ChatGPT via classroom discussions. Communication Education, 73(4), pp.388-404.

Runco, M.A., 2023. AI can only produce artificial creativity. Journal of Creativity, 33(3), p.100063.

Selwyn, N., 2022. The future of AI and education: Some cautionary notes. European Journal of Education, 57(4), pp.620-631.

Song, N., 2024. Higher education crisis: Academic misconduct with generative AI. Journal of Contingencies and Crisis Management, 32(1), p.e12532.

Tan, E., 2013. Informal learning on YouTube: Exploring digital literacy in independent online learning. Learning, media and technology, 38(4), pp.463-477.

Zulfiqar, S., Sarwar, B., Huo, C., Zhao, X. and ul Mahasbi, H., 2025. AI-powered education: Driving entrepreneurial spirit among university students. The International Journal of Management Education, 23(2), p.101106.