Track: Rhetoric, design, social media in (dis)information
processing
AMCIS 2019, the 25th Americas Conference on Information Systems,
August 15 - 17, 2019, Cancun, Mexico
Track Chairs:
Vishal Shah, Assistant Professor, Central Michigan University,
USA, shah3v@cmich.edu
Carlo Gabriel Porto Bellini, Associate Professor, Federal
University of Paraiba, Brazil, cgpbellini@ccsa.ufpb.br
Track Description:
Information processing has become increasingly cognitively
demanding as we are confronted with information in everyday
contexts – anywhere and at any time – through our mobile devices
and social media connections. The processing of information stems
from the fundamental need to connect with and be part of the
world around us (Maslow 1971). Despite our need for social
connection, information overload is a serious threat to our
capacity to process information and to make good decisions based
on it (Eppler & Mengis 2004). Accordingly, our expectation of
being effective in the digital society is also at risk – i.e.,
our expectation of using technology-mediated information for a
purpose and in a systemically healthy way (Bellini 2018). In this
track, we focus primarily on the factors that limit our
effectiveness in dealing with information, and on how such
limitations may lead to bias and propaganda in society.
Bias refers to prejudice for or against a
personal/political/social issue despite the lack of objective
evidence to support it. As recent events throughout the world
have shown, social media platforms can be vehicles for promoting
false narratives that amplify bias and influence public opinion.
There are almost 3.2 billion active social media users in a world
population of 7.5 billion individuals, of whom 2.95 billion are
active through their mobile devices (Hootsuite 2018). Given the
rise in the volume and sources of information and our limited
cognitive capacity, the situation is ripe for the spread of false
information, aka “fake news,” whether through misinformation or
disinformation. By misinformation, we mean false information
spread without malicious intent. By disinformation, we mean the
deliberate spread of false information for malicious purposes.
The ability of individuals, collectives, or state actors to use
platforms like social media to spread disinformation has been
amplified, as evidenced recently in political campaigning and
elections (Marchi 2012; Allcott & Gentzkow 2017).
Misinformation is also relatively prevalent in the everyday use of
social media platforms like Facebook and Twitter. As recent
research (Lazer et al. 2018) points out, we need a new “system of
safeguards” and novel frameworks to approach this problem.
Our purpose in this track is to provide a forum for such
safeguards. We encourage papers that address the broad area of
information spread and technology use, and their effects in
biasing personal and/or political decision-making. This track
specifically encourages submissions of research exploring
innovative ways to identify the mechanisms and causes of spreading
disinformation and ways to deal with these mechanisms in the
context of rhetoric, design, and social media. We invite
submissions that elaborate the causes and impacts of
disinformation/misinformation, including conceptual and
theoretical developments, empirical research findings, case
studies, research in progress, methodology papers, and other
high-quality contributions. Submissions detailing research on
measures to prevent the spread of misinformation/disinformation
(whether theoretical measures, behavioral interventions, or the
design of novel artifacts) are also welcome.
Opportunities in Leading Journals:
Promising papers will be fast-tracked to BAR – Brazilian
Administration Review upon the authors’ consent. BAR is the
international flagship journal of the Brazilian Academy of
Management (ANPAD). It is indexed in Scopus.
Developed Mini-Tracks:
Mini-track 1: Rhetoric, technology, and disinformation
Contact: James Melton, Central Michigan University,
melto1jh@cmich.edu
This minitrack seeks to explore the relationship between rhetoric,
social media platforms, and disinformation. One of the ways to
deal with disinformation and to avoid exacerbating biases is to
have a general population trained in rhetoric. Because the
discipline of rhetoric studies the effects of persuasion on
audiences, it can help make those audiences more aware of
the mechanisms by which disinformation is spread. For example,
recent studies have examined how to inoculate people against
misinformation by asking them to play roles such as “clickbait
monger” (seeking clicks for themselves) or “conspiracy theorist.”
These studies found that when people are made aware of the ease
with which misinformation can be spread, they are more likely to
be critical of it in the future (Roozenbeek & van der Linden
2018; van der Linden et al. 2017). Such
interventions demonstrate that rhetorical awareness of the
mechanisms that enable the spread of disinformation can help
combat bias. We welcome papers at the intersection of
rhetoric, psychology, and information systems that attempt to
solve the problem of disinformation from an interdisciplinary
standpoint.
Mini-track 2: User experience, human-computer interaction, and
design of (dis)information
Contact: Gustav Verhulsdonck, Central Michigan University,
verhu1g@cmich.edu
This minitrack seeks papers at the intersection of User Experience
(UX) design, Human-Computer Interaction (HCI), and disinformation.
Design for user experiences is one way to tackle the problem of
disinformation. Today’s technological devices may promote user
engagement through designers’ deep knowledge of users’ behavior
and psychology (Choi & Kim 2004; Chou & Ting 2003). Persuasive
design and design for behavior motivate users to stay longer on a
platform by “gaming” their behavior or decisions through the
design of an interface (Fogg 2002; Lockton et al. 2010). Such
design can range from simplifying an interface with a clear
call-to-action, so that users make a purchase or stay on the
platform, to deceptive practices in which shaming language is
used to discourage users from opting in or out of policies (aka
“confirmshaming”). Often, design practices serve to clarify
things for the user, but they may also utilize disinformation and
serve the underlying economic motive of the platform. What
mechanisms can help prevent disinformation from a
design point of view? Which design practices should UX designers
consider to counter disinformation and develop more transparent,
ethical design for users?
We encourage all types of papers dealing with the design of
(dis)information, exploring issues of agency, platforms, and
design in light of the challenges of user experience.
Mini-track 3: Social media and disinformation
Contact: Rishikesh Jena, University of Alabama,
rjena@cba.ua.edu
This minitrack seeks papers that elaborate and/or address the
underlying causes of disinformation through technological means.
Researchers have shown that false information spreads faster,
deeper, and farther than truthful statements, in part because
human nature leads us to accept rumors more readily than the
truth (Vosoughi, Roy & Aral 2018). The use of social
technologies, which allow for the quick dissemination of
information, further encourages this dynamic by offering strong
user engagement but little to no context to users. A balancing
act is therefore required between mechanisms that disseminate
information quickly and mechanisms that allow us to check the
validity of that information. Technological developments
(algorithms, big data, artificial intelligence, the Internet of
Things, and smart technologies) hold the promise of combating
misinformation. At the same time, artificial intelligence, big
data, and algorithms make inferences about our online actions,
often to present targeted advertisements or information to us,
while offering little to no access to the information behind
those inferences. In this minitrack, we are therefore looking for
research on the diverse causes of misinformation/disinformation
in social technologies and on the variety of ways these
technologies can help us combat it.
The above minitracks cover a broad range of research (rhetorical,
psychological, behavioral, and technical/artifact-based) on
disinformation, the spreading mechanism(s) behind it, its
implications for political/social/personal decision making and
the formation of biases, and ways to prevent disinformation from
entering everyday discourse. Interdisciplinary submissions are
also encouraged.
Further, we invite additional minitracks along the following
lines:
* Business models of internet companies and their relation to
disinformation/misinformation
* Political and societal impacts of disinformation/misinformation
* Efficacy of measures to counter disinformation/misinformation
* Social bots and spreading of propaganda
* Characteristics of potential disinformation/misinformation
* Policy frameworks to combat disinformation/misinformation
* Proactive and reactive disinformation
Minitrack chairs will be responsible for:
a) promoting their minitrack to generate manuscript submissions to
AMCIS 2019.
b) soliciting/assigning reviewers for manuscripts submitted to
the minitrack.
c) providing recommendations to track chairs about each manuscript
submitted to the minitrack.
To propose a minitrack, you must submit:
a) minitrack chairs (names, emails, affiliation)
b) minitrack title
c) a short description of minitrack for the AMCIS 2019 website (up
to 150 words)
d) call for papers for your minitrack
Important Dates:
September 20, 2018: PCS opens for Minitrack submissions
October 19, 2018: Minitrack submissions are due
October 30, 2018: Minitrack decisions are complete
November 5, 2018: Minitrack revisions are due
January 7, 2019: Manuscript submissions for AMCIS 2019 begin
March 1, 2019: AMCIS manuscript submissions close for authors at
10:00 a.m. PST
March 7, 2019: All papers have assigned reviewers
April 15, 2019: Track Chairs’ recommendations are due
April 24, 2019: Camera-ready papers are due
May 1, 2019: Track session plans are due
References
Allcott, H., & Gentzkow, M. (2017). Social media and fake news
in the 2016 election. Journal of Economic Perspectives, 31(2),
211-236.
Bellini, C.G.P. (2018). The ABCs of effectiveness in the digital
society. Communications of the ACM, 61(7), 84-91.
Choi, D., & Kim, J. (2004). Why people continue to play online
games: In search of critical design factors to increase customer
loyalty to online contents. CyberPsychology & Behavior, 7(1),
11-24.
Chou, Y.J., & Ting, C.C. (2003). The role of flow experience
in cyber-game addiction. CyberPsychology & Behavior, 6(6),
663-675.
Eppler, M.J., & Mengis, J. (2004). The concept of information
overload: A review of literature from organization science,
accounting, marketing, MIS, and related disciplines. The
Information Society, 20(5), 325-344.
Fogg, B.J. (2002). Persuasive technology: Using computers to
change what we think and do (interactive technologies). San
Francisco, CA: Morgan Kaufmann.
Hootsuite. (2018). Global digital snapshot. Retrieved March 23,
2018, from
https://wearesocial.com/blog/2018/01/global-digital-report-2018.
Lazer, D. M., Baum, M. A., Benkler, Y., Berinsky, A. J.,
Greenhill, K. M., et al. (2018). The science of fake news.
Science, 359(6380), 1094-1096.
Lockton, D., Harrison, D., & Stanton, N.A. (2010). Design with
intent: 101 patterns for influencing behaviour through design
(v.1.0). Windsor, UK: Equifine.
Marchi, R. (2012). With Facebook, blogs, and fake news, teens
reject journalistic “objectivity.” Journal of Communication
Inquiry, 36(3), 246-262.
Maslow, A. H. (1971). The farther reaches of human nature. London,
UK: Arkana/Penguin Books.
Roozenbeek, J., & van der Linden, S. (2018). The fake news
game: Actively inoculating against the risk of misinformation.
Journal of Risk Research, DOI: 10.1080/13669877.2018.1443491
van der Linden, S., Maibach, E., Cook, J., Leiserowitz, A., &
Lewandowsky, S. (2017). Inoculating against misinformation.
Science, 358(6367), 1141-1142.
Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true
and false news online. Science, 359(6380), 1146-1151.
Thank you,
Vishal
Vishal Shah
Assistant Professor
Department of Business Information Systems
Central Michigan University
Grawn Hall - 336
Mount Pleasant, MI - 48859
Office: 989-774-4350
Email:
shah3v@cmich.edu