<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Publications | Shawn Ogunseye, PhD</title>
	<atom:link href="https://ogunseye.com/category/papers/feed/" rel="self" type="application/rss+xml" />
	<link>https://ogunseye.com</link>
	<description>Building trustworthy systems where data, people, and purpose align</description>
	<lastBuildDate>Sat, 19 Jul 2025 18:36:47 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://ogunseye.com/wp-content/uploads/2020/07/cropped-cropped-fav_thorns-1-32x32.png</url>
	<title>Publications | Shawn Ogunseye, PhD</title>
	<link>https://ogunseye.com</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Stop Training Your Competitor’s AI</title>
		<link>https://ogunseye.com/stop-training-your-competitors-ai/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=stop-training-your-competitors-ai</link>
		
		<dc:creator><![CDATA[Shawn Ogunseye]]></dc:creator>
		<pubDate>Sat, 19 Jul 2025 18:36:43 +0000</pubDate>
				<category><![CDATA[Publications]]></category>
		<guid isPermaLink="false">https://ogunseye.com/?p=2983</guid>

					<description><![CDATA[<p>When AI becomes part of the team rather than a private assistant, it becomes a platform for organizational intelligence. The most valuable conversations in your&#8230;</p>
The post <a href="https://ogunseye.com/stop-training-your-competitors-ai/">Stop Training Your Competitor’s AI</a> first appeared on <a href="https://ogunseye.com">Shawn Ogunseye, PhD</a>.]]></description>
	</item>
		<item>
		<title>They Can Include AI, But Should They?</title>
		<link>https://ogunseye.com/teaching-students-about-sensible-solutions-in-the-age-of-ai-hype/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=teaching-students-about-sensible-solutions-in-the-age-of-ai-hype</link>
		
		<dc:creator><![CDATA[Shawn Ogunseye]]></dc:creator>
		<pubDate>Sat, 19 Jul 2025 17:40:11 +0000</pubDate>
				<category><![CDATA[Business Management]]></category>
		<category><![CDATA[Publications]]></category>
		<guid isPermaLink="false">https://ogunseye.com/?p=2964</guid>

					<description><![CDATA[<p>I’ve come to believe the most valuable skill we can teach in technology education isn’t how to implement something. It’s how to decide whether something&#8230;</p>
The post <a href="https://ogunseye.com/teaching-students-about-sensible-solutions-in-the-age-of-ai-hype/">They Can Include AI, But Should They?</a> first appeared on <a href="https://ogunseye.com">Shawn Ogunseye, PhD</a>.]]></description>
		</item>
		<item>
		<title>An exploratory study of the critical factors affecting the acceptability of e-learning in Nigerian universities</title>
		<link>https://ogunseye.com/an-exploratory-study-of-the-critical-factors-affecting-the-acceptability-of-e-learning-in-nigerian-universities/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=an-exploratory-study-of-the-critical-factors-affecting-the-acceptability-of-e-learning-in-nigerian-universities</link>
		
		<dc:creator><![CDATA[Shawn Ogunseye]]></dc:creator>
		<pubDate>Fri, 21 Aug 2020 02:10:07 +0000</pubDate>
				<category><![CDATA[Publications]]></category>
		<guid isPermaLink="false">https://shawnogunseye.net/?p=353</guid>

					<description><![CDATA[<p>This paper highlights the factors that will affect the use of e-learning in developing countries in the COVID-19 era.</p>
The post <a href="https://ogunseye.com/an-exploratory-study-of-the-critical-factors-affecting-the-acceptability-of-e-learning-in-nigerian-universities/">An exploratory study of the critical factors affecting the acceptability of e-learning in Nigerian universities</a> first appeared on <a href="https://ogunseye.com">Shawn Ogunseye, PhD</a>.]]></description>
		</item>
		<item>
		<title>Crowdsourcing for Repurposable Data: What We Lose When We Train Our Crowds</title>
		<link>https://ogunseye.com/crowdsourcing_for_repurposable_data/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=crowdsourcing_for_repurposable_data</link>
		
		<dc:creator><![CDATA[Shawn Ogunseye]]></dc:creator>
		<pubDate>Wed, 22 Jul 2020 14:23:00 +0000</pubDate>
				<category><![CDATA[Publications]]></category>
		<category><![CDATA[Citizen Science]]></category>
		<category><![CDATA[Crowdsourcing]]></category>
		<category><![CDATA[Data Quality]]></category>
		<category><![CDATA[Information Quality]]></category>
		<guid isPermaLink="false">https://shawnogunseye.net/?p=177</guid>

					<description><![CDATA[]]></description>
		</item>
		<item>
		<title>Designing for Information Quality in the Era of Repurposable Crowdsourced User-Generated Content</title>
		<link>https://ogunseye.com/designing-for-information-quality-in-the-era-of-repurposable-crowdsourced-user-generated-content/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=designing-for-information-quality-in-the-era-of-repurposable-crowdsourced-user-generated-content</link>
		
		<dc:creator><![CDATA[Shawn Ogunseye]]></dc:creator>
		<pubDate>Wed, 15 Jul 2020 06:54:00 +0000</pubDate>
				<category><![CDATA[Publications]]></category>
		<category><![CDATA[Citizen Science]]></category>
		<category><![CDATA[Crowdsourcing]]></category>
		<category><![CDATA[Data Quality]]></category>
		<category><![CDATA[Information Quality]]></category>
		<guid isPermaLink="false">https://shawnogunseye.net/?p=239</guid>

					<description><![CDATA[<p>Conventional wisdom holds that expert contributors provide higher quality user-generated content (UGC) than novices. Using the cognitive construct of selective attention, we argue that this may not be the case in some crowdsourcing UGC applications. We contend that crowdsourcing systems that seek participation mainly from contributors who are experienced or have high levels of proficiency in the crowdsourcing task will gather less diverse and therefore less repurposable data. We discuss the importance of the information diversity dimension of information quality for the use and repurposing of UGC and provide a theoretical basis for our position, with the goal of stimulating empirical research.</p>
The post <a href="https://ogunseye.com/designing-for-information-quality-in-the-era-of-repurposable-crowdsourced-user-generated-content/">Designing for Information Quality in the Era of Repurposable Crowdsourced User-Generated Content</a> first appeared on <a href="https://ogunseye.com">Shawn Ogunseye, PhD</a>.]]></description>
		</item>
		<item>
		<title>What Makes a Good Crowd? Rethinking the Relationship between Recruitment Strategies and Data Quality in Crowdsourcing</title>
		<link>https://ogunseye.com/what-makes-a-good-crowd-rethinking-the-relationship-between-recruitment-strategies-and-data-quality-in-crowdsourcing/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=what-makes-a-good-crowd-rethinking-the-relationship-between-recruitment-strategies-and-data-quality-in-crowdsourcing</link>
		
		<dc:creator><![CDATA[Shawn Ogunseye]]></dc:creator>
		<pubDate>Tue, 23 Jun 2020 07:15:41 +0000</pubDate>
				<category><![CDATA[Publications]]></category>
		<category><![CDATA[Citizen Science]]></category>
		<category><![CDATA[Crowdsourcing]]></category>
		<category><![CDATA[Data Quality]]></category>
		<category><![CDATA[Information Quality]]></category>
		<guid isPermaLink="false">https://shawnogunseye.net/?p=250</guid>

					<description><![CDATA[<p>Conventional wisdom dictates that the quality of data collected in a crowdsourcing project is positively related to how knowledgeable the contributors are. Consequently, numerous crowdsourcing projects implement crowd recruitment strategies that reflect this reasoning. In this paper, we explore the effect of crowd recruitment strategies on the quality of crowdsourced data using classification theory. As these strategies are based on knowledge, we consider how a contributor’s knowledge may affect the quality of data he or she provides. We also build on previous research by considering relevant dimensions of data quality beyond accuracy and predict the effects of available recruitment strategies on these dimensions of data quality.</p>
The post <a href="https://ogunseye.com/what-makes-a-good-crowd-rethinking-the-relationship-between-recruitment-strategies-and-data-quality-in-crowdsourcing/">What Makes a Good Crowd? Rethinking the Relationship between Recruitment Strategies and Data Quality in Crowdsourcing</a> first appeared on <a href="https://ogunseye.com">Shawn Ogunseye, PhD</a>.]]></description>
		</item>
		<item>
		<title>Do Crowds Go Stale? Exploring the Effects of Crowd Reuse on Data Diversity</title>
		<link>https://ogunseye.com/do-crowds-go-stale-exploring-the-effects-of-crowd-reuse-on-data-diversity/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=do-crowds-go-stale-exploring-the-effects-of-crowd-reuse-on-data-diversity</link>
		
		<dc:creator><![CDATA[Shawn Ogunseye]]></dc:creator>
		<pubDate>Thu, 23 Jan 2020 07:03:22 +0000</pubDate>
				<category><![CDATA[Publications]]></category>
		<category><![CDATA[Citizen Science]]></category>
		<category><![CDATA[Crowdsourcing]]></category>
		<category><![CDATA[Data Crowdsourcing]]></category>
		<category><![CDATA[Data Quality]]></category>
		<category><![CDATA[Information Quality]]></category>
		<guid isPermaLink="false">https://shawnogunseye.net/?p=243</guid>

					<description><![CDATA[<p>Crowdsourcing is increasingly used to engage people to contribute data for a variety of purposes to support decision-making and analysis. A common assumption in many crowdsourcing projects is that experience leads to better contributions. In this research, we demonstrate limits of this assumption. We argue that greater experience in contributing to a crowdsourcing project can lead to a narrowing in the kind of data a contributor provides, causing a decrease in the diversity of data provided. We test this proposition using data from two sources: comments submitted with contributions in a citizen science crowdsourcing project, and three years of online product reviews. Our analysis of comments provided by contributors shows that the length of comments decreases as the number of contributions increases. Also, we find that the number of attributes reported by contributors decreases as they gain experience. These findings support our prediction, suggesting that the diversity of data provided by contributors declines over time.</p>
The post <a href="https://ogunseye.com/do-crowds-go-stale-exploring-the-effects-of-crowd-reuse-on-data-diversity/">Do Crowds Go Stale? Exploring the Effects of Crowd Reuse on Data Diversity</a> first appeared on <a href="https://ogunseye.com">Shawn Ogunseye, PhD</a>.]]></description>
		</item>
		<item>
		<title>Can Expertise Impair the Quality of Crowdsourced Data?</title>
		<link>https://ogunseye.com/can-expertise-impair-the-quality-of-crowdsourced-data/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=can-expertise-impair-the-quality-of-crowdsourced-data</link>
		
		<dc:creator><![CDATA[Shawn Ogunseye]]></dc:creator>
		<pubDate>Mon, 01 Jul 2019 07:23:00 +0000</pubDate>
				<category><![CDATA[Publications]]></category>
		<category><![CDATA[Citizen Science]]></category>
		<category><![CDATA[Crowdsourcing]]></category>
		<category><![CDATA[Data Quality]]></category>
		<category><![CDATA[Information Quality]]></category>
		<guid isPermaLink="false">https://shawnogunseye.net/?p=254</guid>

					<description><![CDATA[<p>It is not uncommon for projects that collect crowdsourced data to be commissioned with incomplete knowledge of data contributors, data consumers, and/or the purposes for which the data collected are going to be used. Such unanticipated uses and users of data form the basis for open information environments (OIEs), and the information collected through systems designed to gather content from users has high quality when it is complete, accurate, current, and provided in an appropriate format. However, as it is assumed that experts provide higher quality information, many types of OIEs have been designed for experts. In this paper, we question the appropriateness of this assumption in the context of citizen science systems – an exemplary category of OIE. We begin by arguing that experts are primarily efficient rule-based classifiers, which implies that they selectively focus only on attributes relevant to their classification task and ignore others. Drawing from existing literature, we posit that experts’ focus on only diagnostic features of an entity leads to a learned inattention to non-diagnostic attributes. This may improve the accuracy of the information provided, but at the expense of its completeness, currency, format, and ultimately the novelty (for unanticipated uses) of the information provided. On the other hand, we predict that non-experts and amateurs may use rules to a lesser extent, resulting in less selective attention and leading them to provide more novel information with less trade-off of one dimension of information quality for another. We propose hypotheses derived from this view, and outline two experiments we have designed to test them across four dimensions of information quality. We conclude by discussing the potential implications of this work for the design of crowdsourcing platforms and the recruitment of expert, amateur, or novice data contributors in studies of data quality in crowdsourcing settings.</p>
The post <a href="https://ogunseye.com/can-expertise-impair-the-quality-of-crowdsourced-data/">Can Expertise Impair the Quality of Crowdsourced Data?</a> first appeared on <a href="https://ogunseye.com">Shawn Ogunseye, PhD</a>.]]></description>
		</item>
	</channel>
</rss>
