Despite its potential, the integration of AI in HE raises several concerns among international students. AI systems rely on vast amounts of personal data to function effectively, and the ways in which this input data is collected, stored, and used create the possibility of privacy breaches and the misuse of sensitive information (Binns et al., 2018). Awareness of this risk, however, is far from universal among international students. One interviewed undergraduate international student commented: ‘I never thought about how ChatGPT uses my data, and no one told me about it, so there probably wouldn’t be much harm.’ This lack of awareness of data security reflects the urgent need to strengthen digital literacy among international students in the era of AI development.
AI systems are only as good as the data they are trained on (West et al., 2019). If the training data lacks diversity or contains inherent biases, such as historical or sampling biases, AI outputs may perpetuate those biases and disadvantage certain student groups, including international students. For example, an AI system trained on historical hiring data may exhibit gender or racial biases if past hiring practices were discriminatory. Such bias can manifest in various ways, such as misinterpreting cultural nuances or unfairly evaluating student performance.