“Bring Your Own Device” (BYOD) programming facilitates accessibility for people who are blind or have low vision

Cheryl Fogle-Hatch, independent professional, USA


This paper reports on two methods of delivering information to blind audiences via the preferred accessibility settings on their smartphones. The methods are compared in terms of ease of setup and use for both creators and users.

Keywords: accessibility, BYOD, smartphone, blind, low vision

The trend to bring your own device (BYOD) into the museum has the potential to increase access for people who are blind or have low vision if exhibit content is designed to take advantage of the accessibility features built into these devices. Accessibility options available on smartphones and tablets include color contrast, screen or print magnification, voice commands such as Apple®’s Siri, and screen readers (voice output) such as VoiceOver for iOS and TalkBack for the Android™ operating system (Irvine et al., 2014). A recent survey (Martiniello et al., 2019) demonstrated that people who are blind or have low vision employ smartphones and tablets for tasks such as object identification, navigation, listening to audiobooks, reading eBooks, and optical character recognition (scanning or converting hard copy and electronic image materials to accessible electronic formats). Therefore, if media are designed to be accessible, individuals may use their preferred accessibility settings on their personal devices to explore an exhibit.

This paper describes and compares two BYOD accessibility solutions implemented by independent project teams of which the author was a member during 2018 and 2019 respectively. The first case study was a prototype design for a traveling archaeological exhibit that contained 3-D printed replicas of stone tools; the prototype included standardized placement of QR-codes that could be scanned to access information about the replica artifacts. The second case study utilized NFC (Near Field Communication) and Bluetooth technologies to scan and access label text at a tactile art exhibition using the proprietary WayAround iPhone and Android app. The findings in this paper are inherently exploratory because they document the author’s reflections at the conclusion of both projects.

Nonvisual accessibility was an integral design component because both project teams included both blind and sighted members. This motivated the teams to create an inclusive experience that would be meaningful to all visitors regardless of their visual acuity. Both project teams chose to implement off-the-shelf accessibility solutions rather than building their own infrastructure. These decisions were influenced by the short duration and small scale of both projects. In both cases, professional relationships between specific project team members and people employed at technology companies were also crucial for the successful implementation of these BYOD solutions. The projects provide examples of how exhibit content can be made accessible to visitors who are blind or have low vision while also being inclusive to sighted audiences.


Prototype accessible traveling exhibit

Two archaeologists and one engineering professor designed a prototype of an inclusive and accessible traveling museum exhibit for both blind and sighted visitors (Fogle-Hatch et al. 2018). The exhibit consisted of 3-D replicas of stone projectile points (spear tips and arrowheads) that could be physically manipulated to be explored tactually. Additionally, QR-codes were included that could be scanned with a smartphone (Lacoma, 2018). The QR-codes directed the user to a webpage that displayed information about the artifacts. This text could be accessed using screen magnification or voice output while also being available for sighted users. Consequently, the exhibit was inclusive since it could be experienced by mixed groups of visitors, whether they happened to be sighted or blind.

The artifacts scanned to produce 3-D printed replicas in this prototype exhibit came from the collections of the Maryland Archaeological Conservation Laboratory, located at the Jefferson Patterson Park and Museum https://jefpat.maryland.gov. Stone projectile points (spear tips and arrowheads) were chosen for this case study because they are relatively inexpensive to replicate as 3-D printed models given their small size (about 5-10 cm or 2-4 inches in length). Furthermore, these stone tools can be distinguished tactually based on differences in their size, shape, and surface features. The projectile points have a triangular tip, a somewhat wider midsection, and a base specifically shaped to attach to the spear or arrow shaft. Although the tools are generally uniform in thickness, they are not entirely flat. The surface of each projectile point contains impressions, called flake scars, that were left behind as material was removed during manufacture.

On May 1, 2018, staff at Direct Dimensions — a company specializing in 3-D scanning, imaging, and reproduction technologies — scanned the projectile points using a Faro Edge Arm, a laser scanning system that produces extremely reliable 3-D scans of complex and intricate objects. Data scanned from each artifact was saved to a laptop along with a picture of the hand-written artifact tag that listed the archaeological site where it was found. The data from each scan was processed into a 3-D model at the Direct Dimensions facility in Owings Mills, Maryland. 3-D prints of these scanned models were produced in an acrylic resin because that material has surface and hardness characteristics that were assessed to approximate the tactile characteristics of the artifacts (figure 1).

Figure 1. 2-D image of stone projectile points modeled and arrayed on a computer screen prior to 3-D printing.


The prototype design required the addition of QR-codes that linked to relevant archaeological information about each 3-D replica. First, the QR-code needed to be easy to scan, and second, it had to be in a standard location that could be found by touch. The project team tried several strategies to accomplish these goals.

A QR-code (short for “quick response” code) is a type of barcode that is square in shape and contains a matrix of dots (Christensson, 2015) that encodes additional information, such as a webpage URL. Using a free web-based tool (https://www.qr-code-generator.com/how-to-create-a-qr-code/), we created QR-codes that linked to specific pages on the website of the Maryland Archaeological Conservation Laboratory. One need only enter the URL to which the QR-code should be linked, and the tool generates the QR-code in common graphic formats.

The QR-codes generated are specific to one particular URL. If the URL changes, the associated QR-code must be updated to direct a smartphone to the new website address. The museum has changed its URLs twice since May 2018. The QR-codes had to be re-created immediately before our test run of the prototype exhibit in October 2018. Then, in August 2019, the museum moved its site to the standard Maryland.gov domain, and the QR-codes will have to be regenerated before the prototype is displayed again. This suggests that exhibit designers should maintain direct communication with the people who maintain the linked webpages so that similar changes do not interrupt access to information.
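One way to catch this problem early is to keep a record of the URL encoded on each printed QR "coin" and periodically compare it against the museum's current page addresses. The following is a minimal sketch of such an audit, assuming a simple mapping of replica identifiers to URLs; all identifiers and URLs shown here are hypothetical, not the exhibit's actual data.

```python
# Hypothetical audit of printed QR "coins" against the museum's live URLs.
# All replica IDs and URLs below are illustrative examples.

# URL encoded on each printed QR coin at fabrication time
printed_payloads = {
    "point-01": "https://example-museum.org/points/clovis.html",
    "point-02": "https://example-museum.org/points/dalton.html",
}

# URL the museum currently serves for each replica
current_urls = {
    "point-01": "https://example-museum.maryland.gov/points/clovis.html",
    "point-02": "https://example-museum.org/points/dalton.html",
}

def stale_coins(printed, current):
    """Return replica IDs whose printed QR payload no longer matches the live URL."""
    return sorted(
        replica for replica, url in printed.items()
        if current.get(replica) != url
    )

if __name__ == "__main__":
    # Flag any coin whose QR-code must be regenerated before the next showing
    for replica in stale_coins(printed_payloads, current_urls):
        print(f"Regenerate QR coin for {replica}")
```

Running such a check before each showing of the exhibit would flag exactly which coins need regeneration, rather than relying on ad hoc discovery that the links have broken.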

Initially, we tried including the QR-code in the 3-D model so that it was printed directly on the surface of each replica. This proved unworkable. While QR-codes are supposed to be very tolerant of surface variation and resolution of the embedded dots, when the QR-codes were printed directly onto the side of our replica artifacts, they were unreadable by our smartphones.

Next, we decided to physically attach the QR-code to each 3-D printed replica, which offered a standardized location that could be found by touch while keeping the design compact. We determined that adding material to one side of the base was the most effective way of creating an attachment point on the 3-D prints. Although this method has the disadvantage of altering features on one face (side) of the projectile point, it preserved the accurate flaking pattern on the other face. Each print was then attached by a sturdy lanyard to a wooden “coin” that carried the corresponding QR-code (figures 2a and 2b). Even with the addition of the lanyards and QR-code “coins,” the exhibit of 3-D replicas remains compact and durable (figure 3).

Figure 2a. Wooden “coin” with QR-code and lanyard.

Figure 2b. 3-D printed stone point with lanyard attachment.

Figure 3. All 18 replica stone points, with affixed QR-code “coins.”


Labor and materials were donated by Direct Dimensions. The cost of materials used in 3-D printing replicas and fabricating and attaching QR-codes was about $350. Readers interested in experimenting with our design process should verify that materials can be produced within their budget before proceeding.


Testing the Design

Our prototype was tested by approximately fifteen people, a mixed group of blind and sighted attendees at a tactile graphics symposium held in October 2018 and organized by the National Federation of the Blind Jernigan Institute in Baltimore, Maryland. We gave brief instructions to our audience to ensure that both blind and sighted people could access the information provided by the QR-code. Attendees were instructed to hold their smartphones over the QR “coin” and open the camera app. We explained that the camera would recognize the QR-code and prompt the user to open a website. Our design — 3-D prints with a standardized, compact, and easy-to-locate placement of QR-codes, along with brief instructions — resulted in successful use by both blind and sighted people. The exhibit is inclusive because everyone could explore the 3-D printed replicas of stone projectile points tactually, and then obtain additional information about these replicas by scanning QR-codes with their own devices.

Once on the website, we directed attendees to find the text describing the projectile point type. For most attendees, the cursor landed on the descriptive text; for others, it landed on the masthead. In those cases, we directed them to scroll down, skipping the masthead of the museum’s website. In the future, this could be avoided by producing dedicated pages containing only the desired information. As new information becomes available for existing display materials, the webpages linked to the QR-codes can be updated, reducing the costs of updates and redesigns.

The prototype exhibit design provides an example of methods that museums can employ to take advantage of both 3-D printing and the access technology common on smartphones to improve the accessibility of information for visitors who are blind or have low vision. QR-codes and an attachment system like the one that we have described could be added to 3-D models, either scans of objects from museum collections or downloads from online libraries such as Sketchfab, Thingiverse, Shapeify, or Tactile. The attachment system could be added to pre-existing 3-D scanned models using free software such as K-3D or paid software such as SketchUp. Provided that QR-codes are updated when changes are made to the linked URLs, they can improve access to materials and information in museum exhibits.

Tactile Art exhibit

The second case study involved access to label text and information for an exhibition of tactile and multisensory art. The exhibition, titled Ways of Seeing, was designed to be accessible to both blind and sighted audiences, providing an opportunity for an immersive art experience. The exhibit was at Gallery CA, a contemporary arts space in Baltimore, Maryland (http://www.galleryca.org). Attendance at the events and throughout the six-week run, from June 7 to July 20, 2019, was estimated at 375 people, including both blind and sighted visitors.

The exhibition consisted of eight stations in a single gallery; seven were positioned around the edge of the room, and one was a free-standing 3-dimensional installation in the center of the space. Art displayed included tactile paintings made with acrylic on canvas, ceramic sculpture, wood carving, and various mixed-media works. Stations were created by ten artists; two stations were created by collaborating artist pairs, and the remaining six stations were produced by individual artists. Information regarding the exhibition remains available online at http://www.sarahbmccann.com/ways-of-seeing.php.

Ways of Seeing was designed, from the beginning, to be inclusive. To that end, the label text posted on the walls next to each work of art was produced in both hardcopy Braille and print. Information included descriptions of the artwork and artist statements. Braille copies of the station descriptions and directions to navigate the gallery were placed on a table by the gallery doors. Label text was also made available electronically via the WayAround Tag and Scan app for iOS or Android, and it could be accessed using an individual’s preferred accessibility settings. In addition to descriptions of artwork and artist statements, WayAround included descriptions of the stations and directions to navigate the exhibition.

The press release that announced the exhibition advised visitors that they could download the free WayAround app for iOS or Android to access information about the artwork. Although the app was publicized in advance, the author noticed that blind visitors downloaded and installed the app when they arrived at the gallery, usually before special events such as the opening, community art workshop, and panel discussion about tactile art. When opening the WayAround app for the first time, visitors were prompted to set up an account with their name, email, and password. Members of the organizing team received prior training in use of the WayAround app so that they could assist visitors with this process. In doing so, we noticed that account creation automatically added the associated email address to a distribution list to receive marketing communications.

Since WayAround employs Near Field Communication (NFC), an NFC reader (usually located near the top of the phone) was required to use the app. For devices that lack an NFC reader, such as the iPhone 6, a second device — the WayLink, a stand-alone NFC reader — had to be paired with the phone over Bluetooth. Visitors who needed to pair their phones received help from members of the organizing team.


Using WayAround directions to navigate the gallery

Information about each artwork was programmed via Bluetooth into a WayTag contained in a round sticker affixed to the top left of each print label. To accommodate sighted visitors, the stickers were confined to the margins of the print label so that text was not obscured. Stickers could be identified by touch, and the visitor could point the phone toward the sticker to scan the tag. A rounded bump (a door stop from a local hardware store) was placed on the baseboard immediately below each WayTag; visitors could find it with a white cane as they walked along the gallery wall. Alternatively, WayTags were embedded into a 3-D map of the gallery that visitors could borrow to study the layout; they could then scan the WayTags on the map.

Figure 4. Illustration showing how WayTags can be used in an art exhibit.

Figure 5. A blind artist, Marguerite Woods, wearing the 3-D map of Gallery CA with embedded WayTags for the Ways of Seeing exhibition.

Station descriptions and directions to navigate the gallery were written jointly by the curator, one of the artists, and the author prior to the exhibition opening. The examples given below demonstrate the detailed information that was provided to visitors who are blind or have low vision.

“Station Four consists of two pedestals with one sculpture on each. Look for an added sound element on the sculpture to your left, by touching the copper tape the sounds will be activated. Turn right to continue to Station Five.”

“Station Six consists of a three paneled multimedia piece and a sculpture. On the wall are the three panels, two that will be completed in the community workshop on June 15, the first panel was done by the artist. To the right of the panels is the pedestal with the sculpture. This sculpture spins! Station Seven is a three-hundred-and-sixty-degree installation in the center of the room. To get there from Station Six turn right and walk forward.”

“Station Seven is in the center of the gallery. It is between two columns so if you get to one of them you are almost there! The station is a free-standing sculpture that extends out from the central portion and can be explored and experienced from all sides.”

Using WayAround as a BYOD solution

WayAround offers services to individuals, businesses, and other organizations. The company also offers a fee-based package, “WayAround for Public Spaces,” that includes programming tags and backing up content to the company’s servers. The art exhibition is featured prominently on the company’s website (https://www.wayaround.com/public). The company charges a monthly fee for public spaces, listed on its website, in addition to the cost of WayTags (estimated at one dollar each). It should be noted that WayAround provided in-kind support to the exhibition by donating access to the public-spaces package throughout its six-week run. One employee programmed the WayTags, using text that we provided, and installed them before the exhibition opening.

Use of WayAround for Public Spaces may be a reasonable BYOD solution for small-scale projects. First, it provides enterprise-level cloud architecture for projects that lack their own online infrastructure for storing information to be accessed by personal devices, or that lack the capacity to establish and maintain such infrastructure. Second, it suits projects of short duration, such as an exhibition that is on view for only a few weeks.



Discussion

Both BYOD solutions were designed to utilize the accessibility features of smartphones that are routinely carried by people who are blind or have low vision. Off-the-shelf solutions were implemented that allowed people to access information on their own devices using their preferred accessibility settings. In both cases, these solutions allowed a greater degree of independence in accessing label text, and, in the case of the WayAround tags, in navigation.

Adoption of a particular BYOD solution may be facilitated if the required tasks to access exhibit content are similar to those that are routine for people who are blind or have low vision. For example, scanning QR-codes is similar to using various apps to recognize text or identify objects. Likewise, employing the NFC reader to scan a WayTag uses gestures that are familiar to those who have used contactless payment systems such as Apple Pay and Android Pay. Visitors who are blind or have low vision may benefit when brief instruction is available for using the BYOD solution, especially if that instruction references some common tasks that they might complete on their own devices.

Acknowledging the rapid pace of technological change, Goldberg (2010) offers criteria that should be met by digital solutions for accessing museum exhibits. Goldberg’s observations concerning programming content, protecting user privacy, and using industry-standard formats are relevant to the BYOD solutions employed in the two case studies described here.

  • Automated or near-automated content loading and reprogramming

Both case studies required programming of tags so that they would access specific text when scanned. The QR-code generator created codes linked to specific URLs through a fillable webform, and each code took seconds to create. Comparable information is not available for WayAround because the company programmed the tags as part of its donated services.

  • Uses non-proprietary, industry standard content formats

The QR-codes used in our prototype exhibit are industry standard (Christensson, 2015; Lacoma, 2018). By contrast, the WayAround app uses proprietary hardware (WayTags) and software (enterprise-level cloud architecture). While the WayAround system is viable, the requirement to use a proprietary app adds another task for visitors. The QR-code system, on the other hand, is already built into smartphones: the camera app automatically recognizes the QR-code and prompts visitors to access the linked URL.

  • Protects user privacy

WayAround collects some private information from visitors during account creation. No account is necessary to scan a QR-code with the camera app that is built into the device’s operating system.

In summary, the QR-code solution uses non-proprietary hardware and software and protects user privacy because it does not collect data from visitors. The WayAround app uses proprietary hardware and software and collects some private information about users. The QR-code solution, being based on an open standard, is recommended when projects are supported by infrastructure to store and access information relevant to the exhibit. The WayAround system, being self-contained, is recommended when a project lacks such infrastructure.

These case studies demonstrate that small-scale projects can be both accessible to people who are blind or have low vision and inclusive of sighted audiences. BYOD programming allows visitors who are blind or have low vision to access information using their preferred accessibility settings on their smartphones, providing a greater degree of independence for the visitor. This is an effective method of delivering exhibit content, label text, and wayfinding information. Particularly if non-proprietary hardware and software are employed, the BYOD solution may also be used by sighted visitors, creating an integrated experience.



Acknowledgements

The prototype exhibit was designed by the author; Donald Winiecki, Ed.D., Ph.D., Boise State University; and Joe Nicoli, then a heritage scanning specialist at Direct Dimensions, Inc. The research staff at the Maryland Archaeological Conservation Laboratory were crucial to this project, particularly Ed Chaney, who granted us access to the artifact collections, and Rebecca Morehouse, who helped us select projectile points for scanning. Michael Raphael, the CEO of Direct Dimensions, provided significant in-kind support for this project, including donating the time and labor of employees to scan artifacts and produce the 3-D models. He also covered the materials cost for the 3-D replicas and QR-code attachment system.

The Ways of Seeing exhibition was organized by a dedicated team of curators, artists, and tactile art enthusiasts: Lou Ann Blake, Jenny Callahan, Ann Cunningham, Cheryl Fogle-Hatch, Anil Lewis, Sarah McCann, Ellen Ringlein, Abigale Stangl, and Marguerite Woods. Ways of Seeing was supported by a Creative Baltimore Fund Individual Artist Grant, Baltimore Office of Promotion & the Arts, Maryland State Arts Council Creativity Grant Program, and 75 individuals who backed a Kickstarter campaign. Significant in-kind support was provided by Gallery CA, Open Works Baltimore, WayAround, and the National Federation of the Blind.



References

Christensson, P. (2015, March 5). QR-code definition. TechTerms, Sharpened Productions. Retrieved August 30, 2018, from https://techterms.com/definition/qr_code

Fogle-Hatch, C., Nicoli, J., & Winiecki, D. (2018, October 12). Designing an accessible traveling exhibit: A case study using 3-D printed replicas of stone tools from archaeological sites. Tactile Graphics in Education and Careers, National Federation of the Blind Jernigan Institute, Baltimore, MD.

Goldberg, L. (2010). Exhibit design relating to low vision and blindness: Current media technology, appropriate application of technology, future research needs. Retrieved from http://www.ncaonline.org/docs/media-goldberg.pdf

Irvine, D., Zemke, A., Pusteri, G., Gerlach, L., Chun, R., & Jay, W. M. (2014). Tablet and smartphone accessibility features in the low vision rehabilitation. Neuro-Ophthalmology, 38(2), 53–59.

Lacoma, T. (2018, October 3). How to scan a QR-code. Digital Trends, Designtechnica Corporation. Retrieved August 30, 2018, from https://www.digitaltrends.com/mobile/how-to-scan-a-qr-code/

Martiniello, N., Eisenbarth, W., Lehane, C., Johnson, A., & Wittich, W. (2019). Exploring the use of smartphones and tablets among people with visual impairments: Are mainstream devices replacing the use of traditional visual aids? Assistive Technology.


Cite as:
Fogle-Hatch, Cheryl. "“Bring Your Own Device” (BYOD) programming facilitates accessibility for people who are blind or have low vision." MW20: MW 2020. Published January 15, 2020. Consulted .