Information professionals facilitate a variety of programs and services in their information organizations. Yet it is not enough to remain functional with the same programming and the same ways of conducting service; doing so risks the organization becoming stagnant and irrelevant in ever-changing internal and external environments. It is the responsibility of information professionals to demonstrate the value of their work, to make it apparent to their organization’s stakeholders that their services are meeting the needs of their users (Stenström, 2015), and to adapt and change their processes as necessary. Assessment is therefore an integral aspect of the information professional’s work: evaluating programs and services to highlight their interaction with, and impact on, their intended users.
To demonstrate the value of their work through assessment, information professionals need to define the purpose of their assessment strategy, use appropriate tools, and explore data collection options beyond the organization’s immediate users. Different kinds of data can be collected, such as user satisfaction, economic impact factors, and social impact factors, that, when appropriately assessed, can both inform an organization’s decision making in relation to its strategic direction and show the value of the organization’s work to its users (Stenström, 2015). This process requires applicable assessment tools, whether built from scratch or tailored from existing instruments to fit an organization’s specific needs, to collect both quantitative and qualitative data and measure specific, relevant outcomes. With the rise of big data, information professionals can collect data at a more granular level than ever before, from internal data such as resource usage to external data such as available national datasets. Yet all of these elements are only as good as the assessment questions behind them and the research methods and data used to answer them, as each audience may need a different form of assessment, or a different way of understanding the same dataset, to see the value that matters most to that audience (Stenström, 2015).
In pursuing my MLIS, I have examined the need for assessment in the context of information literacy (IL) instruction and in measuring the interaction and impact of information organizations with their users through social media activity. Using my research-related skillset as a point of reference, I applied the principles of assessment discussed above to evaluate which elements across various games could translate practically into gamified instruction, developing prototype assessment instruments to evaluate both the games and the gamified instruction. I also learned techniques in social network analysis (SNA) for analyzing social media interactions, conducting my own SNA of a dataset of an information organization’s social media to draw conclusions about the users interacting with that organization and to recommend actionable steps toward a stronger social media presence. As an aspiring information professional, integrating assessment into my workflows, at individual and institutional levels, can contribute to the overall strategic decision making and direction of my future workplace.
Discussion of Competency Supporting Evidence
Assessment is typically conducted with specific goals in mind that are aligned with an organization’s strategic goals and decision making. In the context of IL instruction, there is an increasing need for new assessment strategies in light of the new ACRL Framework for Information Literacy, which replaced the Standards in 2016. Compared to its predecessor, the Framework approaches IL learning and instruction from a more social perspective, moving from a skills-based perspective to an emphasis on the contextual nature of information (Foasberg, 2015). The ACRL presents the Framework as a “cluster of interconnected core concepts” that emphasizes knowledge practices and individual dispositions, shifting away from the set of standards and enumeration of skills and tangible outcomes in the former Standards (ACRL Board, 2015). Yet the discussion of the various “threshold” concepts in the Framework is arguably vague, and amid this ideological shift from Standards to Framework, strategies for assessment are lagging behind. The Framework’s ambiguity can prompt librarians and faculty to discuss how best to teach and assist students in developing IL, but new assessment strategies and tools still need to be created. Critical and novel ways of thinking are therefore needed as information professionals create and tailor their own assessment tools in alignment with the Framework. From my experience creating games in INFO 287, I propose that the principles and perspective of gamification can assist in the development of relevant and actionable assessment tools for IL instruction.
In the process of developing gamified instructional content in INFO 287: Gamifying Information (a Seminar in Information Science topic), I explored and assessed various types of games for their specific game mechanics. The purpose of this exercise was to find gameplay elements that could translate well into instructional games. From scavenger/treasure hunts, badge rewards, social offline games, and path games, I derived basic elements to build assessment tools for evaluating those games, with the aim of implementing usable and relevant elements in my own game builds. [See Competency K for more about my semester-long game building processes and further rationale for gamified psychology IL instruction.] I created a total of four assessment rubrics, each with a different purpose, exploring a specific game mechanic and emphasis tied to the course’s topic of focus in that week’s assignments.
This final version of my assessment tool, created near the end of the course, is a comprehensive take on the various factors that administrators and other stakeholders can use to evaluate instructional games such as the ones I created, as well as to evaluate other games and identify useful game elements for future gamified instruction projects. The tool focuses on five dimensions in the instruction and gamification of psychology IL: intrinsic motivation, difficulty, replayability, educational value, and alignment with the ACRL Framework. These elements are based on Lepper’s (1988) four factors of control, challenge, curiosity, and contextualization that contribute to intrinsic motivation in learning; Kapp’s (2012) discussion of game mechanics and the knowledge domains they can help develop in students (categorized under difficulty, replayability, and educational value); and the ACRL (2015) Framework. Items are scored on a scale of one to five (generally, one represents poor alignment with an aspect and five represents strong alignment), and the document provides specific interpretations of the total scores in each section. The tool also has a section for narrative comments that expand upon the item scoring and note features not captured by the items.
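The scoring logic of such a rubric is simple enough to sketch in code. In the sketch below, the dimension names follow the tool described above, but the item scores, item counts, and interpretation thresholds are illustrative assumptions, not the actual rubric’s values.

```python
# Sketch of rubric scoring: each dimension holds per-item scores (1-5).
# Dimension names follow the tool described above; the item scores and
# interpretation thresholds here are illustrative assumptions.

rubric_scores = {
    "intrinsic_motivation": [4, 5, 3, 4],
    "difficulty": [3, 4, 4],
    "replayability": [5, 4],
    "educational_value": [4, 4, 3],
    "acrl_framework_alignment": [3, 4, 4, 5],
}

def interpret(total, n_items):
    """Map a section total to a coarse interpretation band."""
    avg = total / n_items
    if avg >= 4:
        return "aligns well"
    if avg >= 2.5:
        return "partially aligns"
    return "aligns poorly"

for dimension, scores in rubric_scores.items():
    total = sum(scores)
    print(f"{dimension}: {total}/{len(scores) * 5} "
          f"({interpret(total, len(scores))})")
```

The per-section interpretation bands mirror the idea that the tool attaches meaning to total scores within each section rather than to a single overall number.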
In my approach to creating this assessment tool, I kept in focus the ambiguity of the Framework relative to the instructional goals of the game I was creating and the goals of the institution for which I designed the game in a hypothetical instruction scenario. Building upon a predefined frame of reference and working within an institution’s explicit strategic goals, I learned about assessment strategy by evaluating games in a structured manner, with a purpose and end goal in mind. I was able to develop relevant assessment items in light of the overarching goal of gamifying psychology IL instruction. The insights gained from these evaluation processes can assist in developing assessment strategies for IL instruction, which in turn can shape how I approach assessment for the more traditional programming and services of an information organization in the future.
Assessment can also be applied to an organization’s marketing strategy, particularly in evaluating the effectiveness of messaging across that organization’s social media presences. Social media is a virtual representation of the organic connections that exist among groups of people and things (Hansen, Shneiderman, & Smith, 2011). In measuring an information organization’s social media impact for marketing and service evaluation, it is possible to analyze these virtual, discrete relationships between users, and their interactions with each other and with the organization, against measurable criteria. Metrics that can quantify and graphically represent these relationships come from social network analysis (SNA), the application of network science principles to the study of human relationships and connections that form social networks (Hansen et al., 2011). Information professionals can leverage knowledge of social networks and their functions to assess their current influence, interaction, and impact with their organization’s social media followers, and to generate evidence-based courses of action in response to those assessment findings. I experienced this in INFO 282: Social Network Management and Social Analytics (a Seminar in Library Management topic), where I learned how social networks form and function, how they relate to social media, and how to analyze and evaluate the effectiveness of information organizations’ engagement with those networks.
For the culminating assignment of INFO 282, I had to develop a presentation directed at the leadership of an information organization, discussing my SNA of their social media presence. Further stipulations of this assignment were to discuss important SNA measures in a way that a non-expert would understand, provide an overview of major findings, and offer recommended, actionable next steps clearly based on the SNA. In this process, I learned how to use NodeXL, a third-party extension for MS Excel that allows users to conduct SNA by importing content from social media accounts from a given timeframe, generate SNA metrics, and create graph outputs of relatively large datasets.
As an avid video gamer in my spare time, I am interested in how others create networks around games and their content, made possible by the Internet as a space for both content consumption and content generation in response to that consumption. In SNA terms, such connections are both directed and undirected: directed in the sense that content generators may post directly to game developers, thanking them or complaining about a particular game, and undirected in the sense that content exchanges occur among other users on sites such as Facebook or DeviantArt. I directed my efforts throughout the course toward answering my research questions about the game developer Guerrilla Games and users’ Twitter interactions with Guerrilla’s latest game, Horizon Zero Dawn (HZD). I looked at the different kinds of connections users formed with Guerrilla Games’ Twitter account @Guerrilla by searching for how the hashtags #horizonzerodawn and #hzd were used in discussion of the game, in terms of the associated hashtags that appeared alongside them and the additional multimedia content included in those posts.
As seen in my simulated, recorded presentation to the creative leadership of Guerrilla Games, I discussed SNA basics in relation to the NodeXL-generated graphs of a dataset of Twitter interactions between March and April 2017 that I imported from the @Guerrilla feed. Each node on the graph is a “vertex,” representing a Twitter user. Each line extending from a vertex is an “edge,” representing a mention in a tweet or the recipient of a tweet; edges are drawn as arrows indicating who receives the tweet. The number of edges attached to a vertex shows how highly connected that Twitter user is, based on their own activity and on others interacting with them via retweets, likes, and mentions. From the NodeXL metrics, I identified 20 highly connected users (@Guerrilla included) through the SNA “in-degree” measure, which counts how many other users mention and/or tweet at a given user. I then ran a clustering algorithm to see what kinds of groups emerged around these highly connected users, which yielded three major clusters and numerous smaller ones, all connected to @Guerrilla via their HZD posts. These highly connected users are emphasized in the graph below with their Twitter profile pictures and names.
SNA @Guerrilla Users. Created with NodeXL.
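The in-degree ranking and clustering steps described above can be approximated outside NodeXL. The sketch below uses Python’s networkx library with a toy edge list standing in for the actual Twitter dataset; every user name except @Guerrilla is invented for illustration, and the clustering step uses greedy modularity maximization (the Clauset-Newman-Moore approach NodeXL also offers) on the undirected projection of the graph.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy stand-in for the Twitter dataset: each edge (a, b) means user a
# mentioned or tweeted at user b. All names besides "Guerrilla" are invented.
mentions = [
    ("fan_art_1", "Guerrilla"), ("fan_art_2", "Guerrilla"),
    ("cosplayer", "Guerrilla"), ("fan_art_1", "fan_art_2"),
    ("screenshot_1", "Guerrilla"), ("screenshot_2", "screenshot_1"),
]
g = nx.DiGraph(mentions)

# In-degree: how many distinct users mention/tweet at each account.
# The highly connected users are those with the largest in-degree.
top_users = sorted(g.in_degree(), key=lambda pair: pair[1], reverse=True)
print(top_users[:3])

# Clustering via greedy modularity maximization (Clauset-Newman-Moore),
# run on the undirected projection of the mention graph.
clusters = greedy_modularity_communities(g.to_undirected())
for i, cluster in enumerate(clusters):
    print(f"cluster {i}: {sorted(cluster)}")
```

On the real dataset, the same two steps (rank by in-degree, then cluster) surface the 20 highly connected users and the major groups that form around them.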
After going through the dataset and coding the content of each tweet into distinct multimedia categories (e.g., screenshot, fan art, video, cosplay), I generated the cluster graph below. For ease of viewing, I filtered the graph to show only users with a “betweenness centrality” of 35 or higher (betweenness centrality is an SNA measure of how often a user lies on the shortest paths connecting other users, highlighting the users who act as bridges in the network).
SNA @Guerrilla Users. Created with NodeXL.
From this graph, I reasoned that, given the nature of Twitter, it made sense that the majority of users posted in-game HZD screenshots. Yet there is also significant activity around fan art and cosplay, much of which is retweeted by many users. Guerrilla has the highest betweenness centrality in the network, meaning it sits on most of the shortest paths between users: information constantly flows toward Guerrilla via mentions as many other users retweet the same content. Yet I saw that Guerrilla was not always responding. This is good in terms of publicity (users include @Guerrilla in their HZD tweets), but bad in terms of establishing meaningful connections between Guerrilla and users and surfacing potentially useful content made by users in the network’s subclusters. In practice, there is not much fan art or cosplay-related HZD content on Twitter to begin with, and for what is posted, it can take longer for the users interacting more closely with Guerrilla via screenshot tweets to find and retweet it so that it reaches Guerrilla’s notice on Twitter.
With the SNA that I presented, I then proposed next steps the Guerrilla Games social media team could take to generate more interactions with its followers. I discussed making the ongoing #HZDPhotoMode contest more visible on social media, possible game promotion and collaboration with CDPROJEKTRED (another game developer studio and one of the highly connected users found in the graphs above), and reaching out to trending Twitter users for possible collaboration to increase publicity. These suggestions aimed to leverage the fan-content interest already present on Twitter toward greater publicity for HZD and Guerrilla Games.
SNA focuses on the emergent patterns in relationships between, not within, people, groups, or things, going beyond analysis of individual factors to look at the “connective tissue of societies and other complex interdependencies” that exist in networks (Hansen et al., 2011, p. 32). For information professionals, an information organization’s location in a network is central to understanding how users interact with what is posted on social media, which in turn has implications for how information professionals can leverage that data and conduct analyses to assess the impact of marketing strategies. Establishing connections between an organization and its users is important, as are the quality and directionality of those connections. SNA can render those network connections in graphical form and, if understood well, can help information professionals see the network’s structure, where their information organization stands in its relationships, and even the social roles that interacting users fill in the network.
Simply put, I see that assessment for the information professional means evaluating with purpose. With this in mind, I can apply the evaluation processes gained from these courses to building relevant assessment tools for other instructional purposes and other areas of assessment in an information organization. I can see myself integrating assessment and evaluation into my own future work processes in a possible instructional librarian/liaison position, and also potentially assisting the marketing efforts of my future workplace by evaluating the impact of its social media marketing strategy using SNA principles, all in service of that organization’s strategic goals.
Evaluation of instruction is necessary, especially in light of the changes that came with the shift from the Standards to the Framework. IL instruction under the Framework is still in transition, and it is necessary to develop new assessment strategies and tools that align with the Framework. Yet this is also an opportunity to reimagine the process. Working within a predefined frame of reference and in alignment with an institution’s goals, information professionals can find value in looking beyond traditional means of instructional assessment, such as identifying relevant game elements that translate well into gamified instruction. This in turn can generate novel ways to demonstrate value to an information organization’s stakeholders from a fresh perspective.
Evaluation of interaction and messaging impact on social media is also needed. Information professionals are in a position to see how social media is a viable and worthwhile avenue of assessment and data collection for their organizations. The networks that exist offline extend virtually in this ever-growing, ever-connected age of social media. SNA of social media focuses on these relationships, going beyond traditional participation statistics to uncover social phenomena such as group formation/clustering, group cohesion, social roles, influential users, and overall community patterns of relationship (Hansen et al., 2011). Information professionals can take advantage of these techniques in evaluating their organization’s social media interactions and marketing strategies, and can draw on solid datasets to make their proposals for future directions more evidence-based.
Having a mindset for assessment and evaluation is a critical part of an information professional’s work. Done well, assessment can answer many questions for a variety of audiences: employees and administrators at an internal level, and users and key stakeholders at an external level. Assessment must be multifaceted, rigorous, and meaningful in demonstrating the value and impact of an information professional’s and organization’s work (Stenström, 2015). Moreover, the process of developing and refining assessment strategies is ongoing, changing with the ever-shifting information landscape and the needs of an organization’s users. It is essential, then, that information professionals integrate assessment into their workflows, at individual and institutional levels, for important services like instruction and for evaluating marketing in terms of social media impact.
Hansen, D. L., Shneiderman, B., & Smith, M. A. (2011). Analyzing social media networks with NodeXL: Insights from a connected world. Burlington, MA: Morgan Kaufmann.
Kapp, K. (2012). The gamification of learning and instruction: Game-based methods and strategies for training and education. San Francisco, CA: Pfeiffer.