Preserving the digital

From physical to digital to…?

At Bristol Culture we aim to collect, preserve and create access to our
collections for use by present and future generations. We are increasingly dealing with digital assets amongst these collections – from photographs of our objects, to scans of the historical and unique maps and plans of Bristol, to born-digital creations such as 3D scans of our Pliosaurus fossil. We are also collecting new digital creations in the form of video artwork.

Photo credit Neil McCoubrey

One day we won’t be able to open these books because they are too fragile – digital will be the only way we can access this unique record of Bristol’s history, so digital helps us preserve the physical and provides access. Inside are original plans of Bristol’s most historic and well-known buildings, including the Bristol Hippodrome, which require careful unfolding and digital stitching to reproduce the image of the full drawing inside.

Plans of the Hippodrome, 1912. © Bristol Culture

With new technology come new opportunities to explore our specimens, and this often means having to work with new file types and new applications to view them.

This 3D scan of our Pliosaurus jaw allows us to gain new insights into the behaviour and biology of this long-extinct marine reptile.

Horizon © Thompson & Craighead

This digital collage by Thompson & Craighead features streaming images from webcams in the 25 time zones of the world. The work comes with a Mac mini and a USB drive in an archive box and can be projected or shown on a 42″ monitor. Bristol Museum is developing its artist film and video collection and now holds 22 videos by artists including Mariele Neudecker, Wood and Harrison, Ben Rivers, Walid Raad and Emily Jacir, ranging from documentary to structural film, performance, web-based film and video, and animation, in digital, video and analogue film formats, with accompanying installations.

What could go wrong?

So digital assets are helping us conserve our archives, explore our collections and experience new forms of art, but how do we look after those assets for future generations?

It might seem like we don’t need to worry about that now, but as time goes by there is constant technological change: hardware becomes unusable or non-existent, software changes, and the very 1s and 0s that make up our digital assets are prone to deterioration by a process known as bit rot. Additionally, just as with physical artefacts, the information we know about them, including provenance and rights, can become dissociated. What’s more, digital assets can and must multiply, move and adapt to new situations, new storage facilities and new methods of presentation. Digital preservation is the combination of procedures, technology and policy we can use to prevent these risks from rendering our digital repository obsolete. We are currently upskilling staff and reviewing how we do things so that we can be sure our digital assets are safe and accessible.

Achieving standards

It is clear we need to develop and improve our strategy for dealing with these potential problems, and that this strategy should underpin all digital activity whose output we wish to preserve and keep. To address this, staff at Bristol Archives, alongside Team Digital and Collections, got together to write a digital preservation policy and roadmap to ensure that preserved digital content can be located, rendered (opened) and trusted well into the future.

Our approach to digital preservation is informed by guidance from national organisations and professional bodies including The National Archives, the Archives & Records Association, the Museums Association, the Collections Trust, the Digital Preservation Coalition, the Government Digital Service and the British Library. We will aim to conform to the Open Archival Information System (OAIS) reference model for digital preservation (ISO 14721:2012). We will also measure progress against the National Digital Stewardship Alliance (NDSA) levels of digital preservation.

A safe digital repository

We use EMu for our digital asset management and collections management systems. Any multimedia uploaded to EMu is automatically given a checksum, and this is stored in the database record for that asset. What this means is that if for any reason that file should change or deteriorate (which is unlikely, but the whole point of digital preservation is to have a mechanism to detect if this should happen) the new checksum won’t match the old one and so we can identify a changed file.

Due to the size of the repository, which is currently approaching 10TB, it would not be practical to do this manually, so we use a scheduled script to pass through each record and generate a new checksum to compare with the original. The trick here is to make sure that the whole repository gets scanned in time for the next backup period, because otherwise any missing or degraded files would become the backup and therefore obscure the original. We also need a working relationship with our IT providers and an agreed procedure to rescue any lost files if this happens.
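As an illustration (a sketch only – EMu’s actual API and field names differ), the core of such a fixity check is just streaming each file through a hash and comparing the result with the stored value:

```python
import hashlib

def file_checksum(path, algorithm="md5", chunk_size=8192):
    """Compute a checksum by streaming the file in chunks,
    so large multimedia assets never need to fit in memory."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_changed(records):
    """Compare each record's stored checksum with a freshly
    computed one. `records` is a list of (path, stored_checksum)
    pairs; returns the paths whose contents no longer match."""
    changed = []
    for path, stored in records:
        if file_checksum(path) != stored:
            changed.append(path)
    return changed
```

In practice the script would read the paths and stored checksums from the database and write any mismatches to a report for follow-up with IT.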

With all this in place, we know that what goes in can come back out in the same state – so far so good. But what we can’t control is the constant change in technology for rendering files – how do we know that the files we are archiving now will be readable in the future? The answer is that we don’t, unless we can migrate from out-of-date file types to new ones. A quick analysis of all records tagged as ‘video’ shows the following diversity of file types:

(See the stats for images and audio here).  The majority are mpeg or avi, but there is a tail end of various files which may be less common and we’ll need to consider if these should remain in this format or if we need to arrange for them to be converted to a new video format.
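A breakdown like this is straightforward to produce once filenames can be pulled from the database – a hedged sketch, with the record-fetching step assumed:

```python
from collections import Counter

def file_type_counts(filenames):
    """Tally file extensions (case-insensitive) across a list of
    multimedia filenames, e.g. from records tagged as 'video'."""
    exts = [name.rsplit(".", 1)[-1].lower()
            for name in filenames if "." in name]
    return Counter(exts)

counts = file_type_counts(["clip1.MPG", "clip2.avi",
                           "interview.mov", "old.AVI"])
# e.g. counts["avi"] == 2, counts["mpg"] == 1
```

The `most_common()` method of the resulting counter gives exactly the “majority plus tail end” view described above.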

Our plan is to make gradual improvements in our documentation and systems in line with the NDSA, aiming to achieve level 2 by 2022.

 

The following dashboard gives an idea of where we are currently in terms of file types and the rate of growth:

Herding digital sheep

It’s all very well having digital preservation systems in place, but staff culture and working practices must also change and integrate with them.

The digitisation process can involve lots of stages and create many files

In theory, all digital assets should line up and enter the digital repository in an orderly and systematic manner. However, we all know that in practice things aren’t so straightforward.

Staff involved in digitisation and quality control need the freedom to work with files in the applications and hardware they are used to, without being hindered by rules and convoluted ingestion processes. They should be allowed to work in a messy (to outsiders) environment, at least until the assets are finalised. There are also many other environmental factors that affect working practices, including rights issues, time pressures from exhibition development, and the skills and tools available to get the job done. By layering on new limitations in the name of digital preservation, we risk designing a system that won’t be adopted, as illustrated in the following tweet by @steube:

So we’ll need to think carefully about how we implement any new procedures that may increase the workload of staff. Ideally, we’ll be able to reduce the time staff spend moving files around by using designated folders for multimedia ingestion – these would be visible to the digital repository and act as “dropbox” areas which are automatically scanned, with any files automatically uploaded and then deleted. For this process to work, we’ll need to name files carefully so that once uploaded they can be digitally associated with the corresponding catalogue records created as part of any inventory project. Having a 24-hour ingestion routine would solve many of the complaints we hear from staff about waiting for files to upload to the system.
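A minimal sketch of such a dropbox-style ingestion folder might look like the following; the naming convention and the `upload` callback are assumptions for illustration, not our actual system:

```python
import os
import re

# Assumed naming convention: each file is named after the object
# number of its catalogue record, e.g. "TA2020.12.tif".
OBJECT_NUMBER = re.compile(r"^(?P<object_number>[A-Z]+\d+(\.\d+)*)\.\w+$")

def ingest_folder(folder, upload):
    """Scan a designated ingestion folder, upload any file whose
    name matches the convention, then delete the local copy.
    `upload(path, object_number)` stands in for the repository API."""
    ingested, skipped = [], []
    for name in sorted(os.listdir(folder)):
        match = OBJECT_NUMBER.match(name)
        if match is None:
            skipped.append(name)  # leave unmatched files for a human
            continue
        path = os.path.join(folder, name)
        upload(path, match.group("object_number"))
        os.remove(path)
        ingested.append(name)
    return ingested, skipped
```

Run on a schedule (say, nightly), this gives the 24-hour routine described above, while the `skipped` list is exactly the “human element” needed to clean up anomalies.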

 

Automation can help but will need a human element to clean up any anomalies

 

Digital services

Providing user-friendly, online services is a principle we strive for at Bristol Culture – and access to our digital repository for researchers, commercial companies and the public is something we need to address.

We want to be able to recreate the experience of browsing an old photo album using gallery technology. This interactive uses the Turn JS open source software to simulate page turning on a touchscreen featuring in Empire Through the Lens at Bristol Museum.

Visitors to the search room at Bristol Archives have access to the online catalogue as well as knowledgeable staff to help them access the digital material. This system relies on having structured data in the catalogue and scripts which can extract the data and multimedia and package them up for the page-turning application.
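The packaging step can be pictured as a small script that orders the scanned pages and emits the data the viewer needs (field names here are hypothetical; Turn JS itself consumes HTML/JavaScript, so this shows only the data-preparation side):

```python
import json

def package_album(records):
    """Turn catalogue records into an ordered page list for a
    page-turning viewer. Each record is assumed to carry a page
    number, an image path and a caption (hypothetical fields)."""
    pages = sorted(records, key=lambda r: r["page"])
    return json.dumps(
        [{"image": r["image"], "caption": r["caption"]} for r in pages],
        indent=2,
    )
```

The resulting JSON could then be handed to whatever front end renders the album.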

But we receive enquiries and requests from people all over the world, in some cases from different time zones which makes communication difficult. We are planning to improve the online catalogue to allow better access to the digital repository, and to link this up to systems for requesting digital replicas. There are so many potential uses and users of the material that we’ll need to undertake user research into how we should best make it available and in what form.

 

Culture KPIs

There are various versions of a common saying that ‘if you don’t measure it you can’t manage it’ – see Zak Mensah’s (Head of Transformation at Bristol Culture) tweet below. As we’ll explain, we’re doing a good job of collecting a significant amount of Key Performance Indicator data; however, there remain areas of our service that don’t have KPIs and are not being ‘inspected’ (which usually means they’re not being celebrated). This blog is about our recent sprint to improve how we do KPI data collection and reporting.

The most public face of Bristol Culture is the five museums we run (including Bristol Museum & Art Gallery and M Shed), but the service is much more than its museums. Our teams include, among others: the arts and events team (who are responsible for the annual Harbour Festival as well as the Cultural Investment Programme, which funds over 100 local arts and cultural organisations in Bristol); Bristol Archives; the Modern Records Office; Bristol Film Office; and the Bristol Regional Environmental Recording Centre, who are responsible for wildlife and geological data for the region.

Like most organisations we have KPIs and other performance data that we need to collect every year in order to meet funding requirements, e.g. the ACE NPO Annual Return. We also collect lots of performance data beyond this, but we don’t necessarily have a joined-up picture of how each team is performing and how we are performing as a whole service.

Why KPIs?

The first thing to say is that they’re not a cynical tool to catch out teams for poor performance. The operative word in KPI is ‘indicator’; the data should be a litmus test of overall performance. The second thing is that KPIs should not be viewed in a vacuum. They make sense only in a given context; typically comparing KPIs month by month, quarter by quarter, etc. to track growth or to look for patterns over time such as busy periods.

A great resource we’ve been using for a few years is the Service Manual produced by the Government Digital Service (GDS) https://www.gov.uk/service-manual. They provide really focused advice on performance data. Under the heading ‘what to measure’, the service manual specifies four mandatory metrics to understand how a service is performing:

  • cost per transaction – how much it costs … each time someone completes the task your service provides
  • user satisfaction – what percentage of users are satisfied with their experience of using your service
  • completion rate – what percentage of transactions users successfully complete
  • digital take-up – what percentage of users choose … digital services to complete their task

Added to this, the service manual advises that:

You must collect data for the 4 mandatory key performance indicators (KPIs), but you’ll also need your own KPIs to fully understand whether your service is working for users and communicate its performance to your organisation.

Up until this week we were collecting the data for the mandatory KPIs, but it has been somewhat buried in very large Excel spreadsheets or scattered across different locations. For example, our satisfaction data lives on a SurveyMonkey dashboard. Of course, spreadsheets have their place, but to get more of our colleagues in the service taking an interest in our KPI data we need to present it in a way they can understand more intuitively. Again, not wanting to reinvent the wheel, we turned to the GDS to see what they were doing. The service dashboard they publish online has two headline KPI figures followed by a list of departments which you can click into to see KPIs at a department level.

Achieving a new KPI dashboard

As a general rule, we prefer to use open source and openly available tools to do our work, and this means not being locked into any single product. This also allows us to be more modular in our approach to data, giving us the ability to switch tools or upgrade various elements without affecting the whole system. When it comes to analysing data across platforms, the challenge is how to get the data from the point of data capture to the analysis and presentation tech – and when to automate vs doing manual data manipulations. Having spent the last year shifting away from using Excel as a data store and moving our main KPIs to an online database, we now have a system which can integrate with Google Sheets in various ways to extract and aggregate the raw data into meaningful metrics. Here’s a quick summary of the various integrations involved:

Data capture from staff using online forms: Staff across the service are required to log performance data, at their desks, and on the move via tablets over wifi. Our online performance data system provides customised data entry forms for specific figures such as exhibition visits. These forms also capture metadata around the figures such as who logged the figure and any comments about it – this is useful when we come to test and inspect any anomalies. We’ve also overcome the risk of saving raw data in spreadsheets, and the bottleneck often caused when two people need to log data at the same time on the same spreadsheet.

Data capture directly from visitors: A while back we moved to online, self-completed visitor surveys using SurveyMonkey and these prompt visitors to rate their satisfaction. We wanted the daily % of satisfied feedback entries to make its way to our dashboard, and to be aggregated (both combined with data across sites and then condensed into a single representative figure). This proved subtly challenging and had the whole team scratching our heads at various points thinking about whether an average of averages actually meant something, and furthermore how this could be filtered by a date range, if at all.
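Why an average of averages can mislead is easy to demonstrate: averaging the per-site percentages weights a quiet site the same as a busy one, whereas weighting by response counts gives the true combined figure. A small sketch (illustrative numbers only):

```python
def average_of_averages(site_stats):
    """Naive approach: average the per-site satisfaction percentages."""
    return sum(pct for pct, _ in site_stats) / len(site_stats)

def weighted_satisfaction(site_stats):
    """Combined figure weighted by responses per site.
    `site_stats` is a list of (percentage, response_count) pairs."""
    total = sum(n for _, n in site_stats)
    return sum(pct * n for pct, n in site_stats) / total

# One busy site (90% satisfied from 200 responses) and one quiet
# site (50% from 10 responses):
sites = [(90.0, 200), (50.0, 10)]
# average_of_averages(sites) gives 70.0, but the true combined
# rate is (90*200 + 50*10) / 210, roughly 88.1.
```

The weighted version also filters naturally by date range: restrict `site_stats` to the chosen days before aggregating, rather than averaging pre-aggregated daily percentages.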

Google Analytics:  Quietly ticking away in the background of all our websites.

Google Sheets as a place to join and validate data: It is a piece of cake to suck up data from Google Sheets into Data Studio, provided it’s in the right format. We needed a few tricks to bring data into Google Sheets, however, including Zapier, Google Apps Script, and Sheets Add-ons.

Zapier: gives us the power to integrate visitor satisfaction from SurveyMonkey into Google Sheets.

Google Apps Script: We use this to query the API on our data platform and then perform some extra calculations, such as working out conversion rates of exhibition visits vs museum visits. We also really like the record-macro feature, which we can use to automate any calculations after bringing in the data. Technically it is possible to push or pull data into Google Sheets – we opted for a pull because this gives us control via Google Sheets rather than waiting for a scheduled push from the data server.

Google Sheets formulae: We can join museum visits and exhibition visits in one sheet by using the SUMIFS function, and then use this to work out a daily conversion rate. This can then be aggregated in Data Studio to get an overall conversion rate, filtered by date.
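The same join-then-divide logic, sketched outside the spreadsheet for clarity (an illustration, not our actual formulae):

```python
def daily_conversion(museum_visits, exhibition_visits):
    """Join two {date: count} mappings and compute, per day, the
    conversion rate of exhibition visits to museum visits."""
    return {
        day: exhibition_visits.get(day, 0) / total
        for day, total in museum_visits.items()
        if total > 0  # skip days with no recorded museum visits
    }

rates = daily_conversion(
    {"2018-03-01": 1000, "2018-03-02": 800},
    {"2018-03-01": 250, "2018-03-02": 200},
)
# rates["2018-03-01"] == 0.25 and rates["2018-03-02"] == 0.25
```

An overall rate for a date range should then be computed from the summed counts, not by averaging the daily rates, for the same reason as the satisfaction figures above.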

Sheets Add-Ons: We found a nifty add-on for integrating sheets with Google Analytics. Whilst it’s fairly simple to connect Analytics to Data Studio, we wanted to combine the stats across our various websites, and so we needed a preliminary data ‘munging’ stage first.

Joining the dots…

1.) Zapier pushes the satisfaction score from SurveyMonkey to Sheets.

2.) A Google Sheets Add-on pulls Google Analytics data into Sheets, combining figures across many websites in one place.

3.) Online data forms save data directly to a web database (MongoDB).

4.) The performance platform displays raw and aggregated data to staff using ChartJS.

5.) Google Apps Script pulls in performance data to Google Sheets.

6.) Google Data Studio brings in data from Google Sheets, and provides both aggregation and calculated fields.

7.) The dashboard can be embedded back into other websites including our performance platform via an iframe.

8.) Good old Excel and some VBA programming can harness data from the performance platform.

Technologies involved in gathering and analysing performance data across museums.

Data Studio

We’ve been testing out Google Data Studio over the last few months to get a feel for how it might work for us. It’s definitely the cleanest way to visualise our KPIs, even if what’s going on behind the scenes isn’t quite as simple as it looks on the outside.

There are a number of integrations for Data Studio, including lots of third party ones, but so far we’ve found Google’s own Sheets and Analytics integrations cover us for everything we need. Within Data Studio you’re somewhat limited to what you can do in terms of manipulating or ‘munging’ the data (there’s been a lot of munging talk this week), and we’re finding the balance between how much we want Sheets to do and how much we want Data Studio to do.

At the beginning of the sprint we set about looking at Bristol Culture’s structure and listing five KPIs each for 1.) the service as a whole; 2.) the 3 ‘departments’ (Collections, Engagement and Transformation) and 3.) each team underneath them. We then listed what the data for each of the KPIs for each team would be. Our five KPIs are:

  • Take up
  • Revenue
  • Satisfaction
  • Cost per transaction
  • Conversion rate

Each team won’t necessarily have all five KPIs but actually the data we already collect covers most of these for all teams.

Using this structure we can then create a Data Studio report for each team, department and the service as a whole. So far we’ve cracked the service-wide dashboard and have made a start on department and team-level dashboards, which *should* mean we can roll out in a more seamless way. Although those could be famous last words, couldn’t they?

Any questions, let us know.

 

 

Darren Roberts (User Researcher), Mark Pajak (Head of Digital) & Fay Curtis (User Researcher)

 

 

 

Discover St. Paul’s Black History in a storymap and walks

By Tanja Aminata Bah, MA Curator-in-training at M Shed / Social History Team

Discover Black History in St. Paul’s via a story map and walks
Always wanted to find out more about your local area? Ever wondered where the Bamboo Club was or where the St.Paul’s riots started? St. Paul’s is full of exciting stories waiting to be discovered with this new handy introduction to Black History in the area.

Over the course of the last year, I have been placed with Bristol Culture’s Social History Team at M Shed and Blaise Castle House Museum as part of my MA Curating at UWE Bristol. My interest in Black History, engagement and innovation through digital media in museum spaces led me to create, as my final project, a story map reimagining, preserving and documenting key Black Bristolian stories. The map not only offers stories, which I gathered via a call-out for information, but also showcases some unique, not yet published archival imagery of St. Paul’s and people in the area.
The map is fully integrated with Google Maps for Android and iPhones and can be used here in your browser.

How to use the map

The map has different layers, which can be navigated by clicking (this icon). The map works best on mobile devices such as Android phones and iPhones. Simply open this blog post in your browser and click the enlarge icon in the right corner. This will lead you to the Google Maps integration, where you can scroll through the tours and layers of the map on the go.

Walking tours online

I have designed three unique walking tours, giving you insights while you explore the area. If you enable GPS on your phone, the tours will even lead you from stop to stop.

  1. Only have an hour to spare? Essential St. Paul’s is your brief 101 to St. Paul’s African Caribbean history since the 1950s. The hour-long stroll follows a leisurely flat course around the heart of St. Paul’s, Grosvenor Road and City Road, and offers plenty to see in a short time. If you haven’t got internet on the go you can also download and print out a leaflet here.
  2. If you want to explore for a bit longer you can try out the walk Before The Riots. The walk is flat and will lead you from the Bamboo Club near Portland Square to the Empire Sports Club near St. Agnes, exploring St. Paul’s between 1950 and 1980.
  3. Want it all? The Full Walk will lead you from the Bamboo Club to Ashley Parade on a two-hour uphill course. You will learn all about the African Caribbean community in St. Paul’s and Montpelier before heading to St. Werburghs to learn about two Victorian and Edwardian Black Bristolian families.

St. Paul’s Vibes

While you are out and about exploring, you can listen to a selection of my favourite tracks that remind me of St. Paul’s, including many Bristolian artists such as Massive Attack alongside classics of Calypso and Roots Reggae, which enjoyed a popular following in St. Paul’s.

Finding out more

Got curious about some of the stories? Here is a handy list for finding out more about Black History in and around St. Paul’s.

The project would not have been possible without my mentor Catherine Littlejohns, curator of Social History, as well as the kind support of Bristol Museum, M Shed, Bristol Archives and UWE staff alongside local stakeholders. Thank you!

Tanja Aminata Bah (Twitter: @jakumata, tanja2.bah@live.uwe.ac.uk) is an MA Curating student at UWE Bristol and is placed as curator-in-training with M Shed and the Social History team. In her studies, she is interested in the crossroads between history, representation and digital developments in the heritage field. She holds a B.A. in History and African Studies from the University of Cologne. Her studies are supported by the Rosa Luxemburg Foundation.

Exhibitions online

We recently (softly softly) went live with Exhibitions Online.

A place to translate our in-house exhibitions for an online audience, it was built with Mike and Luke at Thirty8 Digital, who helped us create a narrative structure with scroll-through content and click-through chapters on WordPress. They built in lovely features such as object grids, timelines, slideshows, maps and quotes.

There are a few exhibitions already up: past (death: the human experience), present (Empire through the Lens) and future (What is Bristol Music?). We’ve most recently used it for our European Old Masters gallery to showcase a beautiful painting we have on loan for two years: St Luke Drawing the Virgin and Child by Dieric Bouts (I discovered the Pantone app with this one, taking the red from the gallery to use online. V satisfying). I’m currently working with the exhibition team to get our Pliosaurus! exhibition up – watch this space for some fun things with that one, which we’re hoping to use for interp in our Sea Dragons gallery at Bristol Museum & Art Gallery too.

(For the What is Bristol Music? exhibition opening in May 2018, we’re using WP plugin Gravity Forms to collate people’s experiences and pictures of the Bristol music scene to be featured in the physical exhibition. Chip in if you have a story to tell.)

So far, we’ve found the content and arrangement really depend on the exhibition. The idea isn’t to simply put the physical exhibition online (I say ‘simply’, as if it would be) but instead to use the format and content of the exhibition to engage with people in a different environment: albeit one where we’re competing with a thousand other things for people’s attention. Exhibitions which have been and gone have been slightly more challenging, as the content was never intended for this use and has needed some wrangling. The more we use it, though, the smoother the process gets, now that we know what we need and it’s on teams’ plans as something to consider.

We’re still in the early stages of reviewing analytics to see how people are using it. Initial results are heartening, though, with a few thousand visits having had minimal promotion. At the moment most people are finding it from our what’s on pages (where most of our traffic to the main website is anyway) and we’re thinking about what campaigns we can do to get it out there more.

Any feedback or thoughts, hmu → fay.curtis@bristol.gov.uk

MA Final Project St. Paul’s Black History virtual map – Beta

Hi! I am Tanja, a current MA Curating student at UWE, placed as a curator-in-training with the Social History Team in Bristol Culture since January 2017. I am interested in engagement work, Black History and innovation through digital media in museums. Aside from assisting the Social History Team, I became involved mainly with digital developments: writing up a project proposal to redevelop the “Big Question Displays” in M Shed to address Brexit on a limited budget, as part of my course, and writing up a new online collection highlight on “Green Bristol”. For my final project I aimed to contribute to the service by piloting something new and innovative, yet budget-friendly for the service, that cuts across my interests.

I decided to develop a customised Google map to document Black History in St. Paul’s, capturing some key stories of prominent Black Bristolians who were and are active in the area. Initially planned as a walking tour, one motivation for me was to preserve these stories in an ever-changing St. Paul’s and reimagine them for an online audience, who might want to access the map remotely and as a “gateway” to first insights into Black History. While focusing on the African Caribbean community from the 1950s, I wanted to design something speaking to digital natives and older generations alike. One of my inspirations was “Black Histories London”, a research project capturing the Black presence in London from 1958 to 1981 by Rob Waters, who works at the University of Sussex and the Sussex Humanities Lab.

From June to September I contacted local stakeholders, reached out via our Bristol Museums blog and researched intensively in the archives, while tracing old material from other service-affiliated projects such as the Black Bristolian Learning Resource and the Bristol Black Archives Partnership, to combine information into this new digital offering.

Over the last months, I developed a prototype that I wanted to share with you as an early beta test to gather feedback. The prototype will also be trialled with some members of the Bristol Culture Youth Panel in a feedback workshop on Wednesday 8th November 2017. Although this prototype is fully functional, it has not yet been revised in its size and scope. Texts for the stations are still early drafts, some pictures will change, and some stations will not end up in the final version.

At the moment I am looking at the following questions:

  • How is the layout and design working?
  • Should I use multiple layers sorting stories after themes, instead of one full layer?
  • Should I do a second map for possible walking routes or work in one map with layers?
  • How are the texts and the stations? Are they fully understandable? Do they contain unnecessary information?

 

The project is already fully integrated with the Google Maps app for Android and iPhones and has been tested. To access it on the go, the user needs to open this blog post (and later the final blog post) in their browser and then click the “fullscreen”/enlarge icon. This should automatically open the map in the Google Maps app.

The end product will be offered to the public via a blog post in late November and will hopefully be supplemented by a “Discover and Walk your own” guide/booklet as a PDF download. I am also currently exploring ways to further integrate the legacy of the project into M Shed, in the form of a QR code label linking to the map.

It would be great to hear your feedback and ideas for improvement, as well as general thoughts on this project. I have created a Google survey to fill with your impressions and ideas. The form is completely anonymous and does not require any personal data!
After the Youth Panel workshop I will start systematically evaluating the different tools and map types I discovered and how this pilot is proceeding.

Off-line surveys: successfully not losing data

Losing survey data is a pain – unfortunately the events team lost six events’ worth of survey data collected using off-line surveys. The team used iPads (cost per iPad is c.£320) to conduct surveys on software sourced outside our team (I’m not sure what system it was). They used the software on the basis that it claimed to offer off-line surveys, i.e. without an internet connection/wi-fi. The idea was that the data could then be uploaded once the iPad was connected to the internet. When they came to do so, however, the data was simply not there and they had lost it all.

The events team came to the digital team this year to ask if we could help them with the public surveys for the 2017 Harbour Festival. The festival is held across much of Bristol city centre, and therefore in order to conduct surveys digitally using iPads we would need to do so without relying on a wifi connection. Of course, one option would be to conduct the surveys with good old pen and paper, but as a digital-first service we were happy to accept the challenge.

One of the main reasons we want to avoid paper surveys is that it is time-consuming and difficult to digitise the results: it requires someone to sit at a computer and manually input them, and staff resources are often limited, so this is a job we’d rather not give ourselves. Practically, paper can also be unruly, there are issues with handwriting legibility, and forms are easy to lose when relying on volunteers to collect them, so a digital solution is very desirable.

The challenge came down to finding the right software that I could install on the iPad and test, and that didn’t cost too much. Our usual platform for conducting surveys on iPads where we do have an internet connection is SurveyMonkey (we pay for the gold subscription £230 per year). Unfortunately, off-line surveys are not a feature available on SurveyMonkey.

These are a few apps I tried that weren’t right for one reason or another:

  • Qualtrics – poor trialling options and expensive for full features: £65 for one month or £435 for one year
  • iSurveys (harvestyourdata.com) – the free account is limited, their main website is difficult to use and I couldn’t work out how much the full-feature product cost
  • SurveyPocket by QuestionPro – the trial is difficult to use and full-feature pricing is only available by contacting the company
  • The one I almost went for: QuickTap Survey & Form Builder – good pricing options from $16 per month and the trial is OK

So, after trawling the internet and the App Store for options, the one we went for is an app called Feed2go (www.feed2go.com).

Quick note: before I speak about the virtues of Feed2go, I have to make it clear that it is currently only available on the Apple App Store; it is not available for Android devices in the Play Store (the QuickTap Survey app is available on Android).

I downloaded the Feed2go app onto my iPad and it was ready to go with pretty much all features available – certainly enough to get a feel for whether it was right. Most crucially, on the basic/trial version you can conduct offline surveys and test whether the data is stored securely and uploads successfully – which I did, and it worked. A major advantage of Feed2go is that access to all the app’s features (Pro) is a very reasonable subscription: £2.49 for 1 month, £4.99 for 3 months, or £12.49 for 1 year. At these costs there is virtually no risk in trying the Pro subscription.

If anyone is interested in trying the App, I would suggest going ahead and downloading and having an explore. There are just a couple of things I will highlight:

  • The user interface is clean and easy to use
  • The range of question types is OK and covers most bases, but it is more limited than something like SurveyMonkey
  • Some of the navigation in the app can be a bit clunky, especially when designing survey forms, but once you get used to it it’s fine
  • Probably the most significant feature of Feed2go to mention is running the same survey across multiple devices. This is not a particular strong suit of Feed2go, but it does work. You need to download Feed2go on each device and then share the survey between them via a cloud storage service – the best one in my experience is Dropbox. The app has an export/import function for sharing survey forms between devices, which also means you will need to collate the results from the different devices at the end.
  • As noted above, the Feed2go app needs to be downloaded on each iPad. In our case all our service iPads are registered to one email address, which means we can use the one subscription across all of our devices. If your iPads are registered to different email addresses, a subscription will need to be paid for each.

Overall, yes, the experience of using the app could be improved a little. But the main feature we wanted it for – saving results and uploading them successfully – worked 100% of the time. I think what distinguishes Feed2go from the previously (unsuccessfully) used software is that the latter operated through a web browser and relied on a cache of temporary internet files, whereas Feed2go is an app that stores the data securely in a folder, in the same way the camera stores photos on the iPad. Finally, the FAQ on the Feed2go website and the email support for the app are great; the developer is really responsive.

We have now used the app to conduct surveys in the estate around our Blaise Castle House Museum site, and we are planning to replace paper exit surveys at our houses (where we don’t have wifi) with the offline app.

If you have any comments or questions about doing offline surveys, or surveys in the cultural sector generally, please get in touch – I’m happy to have a chat: darren.roberts@bristol.gov.uk

Going digital with our Exhibition Scheduling Timeline

 

 


BACKGROUND

Having a visual representation of upcoming exhibitions, works, and major events is important in the exhibition planning process. Rather than spotting clashing dates in lists of data, having a horizontal timeline spread out visually allows faster cross-checking and helps us collaboratively plan exhibition installs and derigs.

 

Until recently we had a system that used Excel to plan out this timeline: by merging cells and colouring horizontally it was possible to construct a timeline manually. Apart from the pure joy that comes from printing anything from Excel, this method had a number of limitations:

  • When dates changed, the whole thing needed to be rejigged
  • Everyone who received a printed copy at meetings stuck it to the wall, so date changes were hard to communicate
  • We need to see the timeline over different scales – short term and long term – which meant two separate Excel tabs and duplicated effort
  • We were unable to apply any permissions
  • The data was not interoperable with other systems

TIMELINE SOFTWARE (vis.js)

Thanks to Almende B.V. there is an open source timeline library available at visjs.org/docs/timeline, which offers a neat solution to the manual task of recasting the timeline with creative Excel skills each time. We already have a database of exhibition dates following our digital signage project, so this was the perfect opportunity to reuse that data – which should be the most up-to-date version of planned events, as it is what we display to the public in our venues.
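To give a flavour of how our exhibition records map onto the library, here is a minimal sketch. The input field names (`title`, `gallery`, `openDate`, `closeDate`) are illustrative assumptions rather than our actual schema; the output shape is the item format a vis.js Timeline expects.

```javascript
// Convert exhibition records into the { id, content, start, end, group }
// item objects that vis.js Timeline consumes. Input field names are
// hypothetical.
function toTimelineItems(exhibitions) {
  return exhibitions.map((ex, i) => ({
    id: i,
    content: ex.title,
    start: ex.openDate,   // ISO date strings, e.g. "2017-07-08"
    end: ex.closeDate,
    group: ex.gallery     // one horizontal track per gallery
  }));
}

// In the browser the items are then handed to the timeline, roughly:
//   const timeline = new vis.Timeline(container, items, groups, options);
const items = toTimelineItems([
  { title: "Pliosaurus!", gallery: "Gallery 5",
    openDate: "2017-07-08", closeDate: "2018-01-01" }
]);
```

The `group` field is what gives each gallery its own horizontal track on the rendered timeline.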

IMPLEMENTATION

The digital timeline was implemented using the MEAN stack and combines data feeds from a variety of sources. In addition to bringing in data for agreed exhibitions, we wanted a flexible way to add installations, derigs, and other notes, so a new database on the node server combines these dates with the exhibitions data. We can assign permissions to different user groups using open source authentication libraries, which means we can now release the timeline to staff not involved in exhibitions, while also letting various teams add and edit their own timeline data.
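The combining step might look something like this sketch, assuming both feeds have already been fetched as arrays of timeline items. The `type` field and the use of `className` for per-type styling are assumptions about how install/derig blocks could be distinguished, not a description of our exact code.

```javascript
// Merge the agreed-exhibitions feed with manually added blocks
// (installs, derigs, provisional dates) into one item list.
// vis.js applies `className` as a CSS class on each rendered item,
// which allows per-type colouring.
function combineFeeds(exhibitionItems, manualBlocks) {
  const tagged = manualBlocks.map(block => ({
    ...block,
    className: block.type  // e.g. "install", "derig", "provisional"
  }));
  return exhibitionItems.concat(tagged);
}

const merged = combineFeeds(
  [{ id: 1, content: "Exhibition A", start: "2017-10-01", end: "2018-01-07" }],
  [{ id: "m1", content: "Install", type: "install",
     start: "2017-09-25", end: "2017-09-30" }]
);
```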

The great thing about vis is how easy the timeline is to manipulate: users can zoom in and out, and move backwards and forwards in time, using mouse, arrow keys or touch/pinch gestures.

 

Zoomed out view for the bigger picture
Zoomed in for the detail…

EMU INTEGRATION

The management of information surrounding object conservation, loans and movements is fundamental to successful exhibition development and installation, so we maintain a record of exhibition dates in EMu, our collections management software. The EMu events module records when exhibitions take place, along with the object list where curators select and deselect objects for exhibition. Using the EMu API we are able to extract a structured list of exhibitions information for publishing to the digital timeline.
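Once the records are fetched, the publishing step is a straightforward mapping. A sketch under assumptions: `irn` is EMu's internal record number, but the other field names here are guesses at the response shape, not EMu's actual API format.

```javascript
// Shape already-fetched EMu events-module records into timeline items,
// skipping any record without firm start and end dates. `irn` is EMu's
// internal record number; the other field names are hypothetical.
function emuEventsToItems(records) {
  return records
    .filter(r => r.startDate && r.endDate)  // drop undated records
    .map(r => ({
      id: r.irn,
      content: r.eventTitle,
      start: r.startDate,
      end: r.endDate
    }));
}
```

Filtering out undated records keeps provisional entries in EMu from appearing on the public-facing timeline before dates are agreed.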

HOW OUR TIMELINE WORKS

Each gallery or public space has its own horizontal track where exhibitions are published as blocks. These are grouped into our five museums and archives buildings, which can be selected and deselected on the timeline for cross-referencing. Once logged in, a user is able to manually add new blocks to the timeline, pre-set to “install”, “derig” or “provisional date”. Once a block is added, our exhibitions team can attach notes that are accessible by clicking the block. It is also possible to reorder blocks and adjust dates by clicking and dragging.

IMPACT

The timeline now means everyone has access to an up-to-date picture of upcoming exhibition installations, so no one is working from stale dates. It sits on a public platform and is mobile accessible, so staff can use it on the move, in galleries or at home. Less time is spent on creative Excel manipulation and more on spotting errors. It has also made scheduling meetings more dynamic, allowing better cross-referencing and jumping to different points in time. An unexpected effect is that we keep spotting more uses for the solution and are currently investigating using it for booking rooms and resources. There are some really neat things we can do, such as importing a data feed from the timeline back into our MS Outlook calendars (“oooooh!”). The addition of the thumbnail pictures used to advertise exhibitions has been a favourite feature among staff and really helps give an instant impression of current events, since it reinforces the exhibition branding people are already familiar with.
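The Outlook trick boils down to serving the timeline blocks as an iCalendar feed that calendar clients can subscribe to. A rough sketch, assuming items carry ISO `start`/`end` date strings; the `timeline.example` UID domain is a placeholder, not a real endpoint.

```javascript
// Serialise timeline items as a minimal iCalendar (RFC 5545) feed.
// Outlook can subscribe to the resulting .ics URL. All-day events are
// used here, so only the date part of start/end matters.
function toICalendar(items) {
  const fmt = iso => iso.replace(/-/g, "");  // "2017-10-01" -> "20171001"
  const events = items.map(it => [
    "BEGIN:VEVENT",
    `UID:${it.id}@timeline.example`,         // placeholder domain
    `DTSTART;VALUE=DATE:${fmt(it.start)}`,
    `DTEND;VALUE=DATE:${fmt(it.end)}`,
    `SUMMARY:${it.content}`,
    "END:VEVENT"
  ].join("\r\n"));
  return ["BEGIN:VCALENDAR", "VERSION:2.0", "PRODID:-//timeline//EN",
          ...events, "END:VCALENDAR"].join("\r\n");
}
```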

ISSUES

It is far from perfect! Several iterations were needed to develop the drag-and-drop feature for adding events. We are also hitting diminishing returns on performance – with more and more data to plot, the web app is slowing down and could do with further optimisation. And due to our IT infrastructure many staff use Internet Explorer; whilst the timeline broadly works, several features are broken on that browser without changes to its compatibility and caching settings.

WHAT’S NEXT

Hopefully optimisation will improve performance and then it is full steam ahead with developing our resource booking system using the same framework.

 

 

Rowan Whitehouse joins the Digital Team

Hello! My name is Rowan Whitehouse and I am currently working as a cultural support apprentice for Bristol Museums.

I have been doing six week rotations around various departments, and as part of my third, with the digital team, I’ve been asked to review some of the technology around the museum.

So, off to find some!

I noticed that the distribution of technology around the museum is heavier in areas with a higher number of children. Whilst there is a lot around the ground floor, particularly the Egypt and Natural History galleries, levels definitely drop off the more steps you climb towards the Fine and Applied Arts galleries. I think this is due, in part, to many children’s interests leaning towards the dinosaur/mummy side rather than Bristol’s history of stone pub ware. Perhaps there are also certain established ideas about what an art gallery should be, whereas many of the historic collections lend themselves well to interactive displays.

Upstairs, the technology has a distinctly more mature focus.
I chose to look at a tablet/kiosk in the European Old Masters gallery as an example. The kiosk itself fits well into its surroundings; the slim, white design is unobtrusive – something desirable in such a traditional gallery space. The kiosk serves as an extension of the wall plaques: it holds an index of the paintings in the room with information on each. I think this is a great idea, as the size of wall plaques often constrains the amount of information available.

A big drawback, I felt, was that the kiosk was static and fixed in one place. I observed that as people moved around the gallery they would continually look from the painting to its accompanying plaque, taking in both at the same time. Though the kiosk has more information, it would need to move with the user to have the advantage over the plaques. On the position of the kiosk itself, I think it would receive more use in the middle of the room rather than in the corner, where it is overlooked. Signage on the wall advertised a webpage that could be accessed on a handheld device and provided the same information as the kiosk. I felt this was a better use of the index, and it could be made even easier to access via a QR code. I wonder, though, whether people would want to use their phones like this in a gallery, and whether ideas about the way we experience art are the ultimate obstacle. I’ll be researching how other institutions use (or don’t use) technology in their galleries.

I wanted to see how technology is being used differently with the historic collections, so I headed back downstairs to the Egypt gallery. I observed a school group using the computers at the back of the gallery; both the children and their teacher struggled with the unusual keyboard layout and rollerball mouse, unable to figure out how to search. Eventually they came upon it by chance, and enjoyed navigating the images and learning more about the objects in the gallery. The computers also have a timeline view showing the history of the Egyptians, and an “Explore” function for looking at specific subjects.

I think the location of the units massively benefits interaction; the dedicated space with chairs really invites and encourages visitors to engage. On using the technology, I felt the access problems could easily be fixed with some stickers highlighting the left mouse button, and something to resolve the stiffness of the rollerball.

My favourite interactive pieces in the museum were in the Egypt gallery. I loved the screens that featured the discovery of a body, asked the user what they thought about the body being in a museum, and gave them the option of viewing the body at the end of the text. This type of interaction is fantastic: rather than just providing information, it engages the visitor directly and is a great way of broaching questions that may not usually occur to visitors.

I’m looking forward to the next six weeks, and learning more about digital engagement in museums.

With such a fantastic collection, it’s exciting finding new ways of presenting it and helping visitors interact with objects.

New locker alert

Photo showing grey lockers

Adding additional lockers to our museums is a top 5 request from the public and staff alike. On Wednesday the 20th September we installed new lockers at Bristol Museum & Art Gallery and M Shed.

Until this week we had only 8 lockers at Bristol Museum & Art Gallery, which is not exactly lots when you have 400,000-plus visits. At both museums the lockers have been finished in a suitable RAL colour way. We’ve introduced a £1 non-refundable fee which will initially repay the cost of the lockers and then be used to support our work. Slow money but sure money.

The main considerations for lockers are:

  • Custom brand colours
  • Coin-retention locks
  • The number of doors per locker – 2, 3 or 4 (more doors means more income, but smaller compartments are less useful if size matters)
  • Installation
  • Location in the building
  • Disclaimers and cost messaging

The install didn’t quite go to plan. I asked for lockers. I got lockers. However I also needed the following which I hadn’t specified:

  • Numbered lockers – inserts so that the public can remember which locker they used
  • Numbered key fobs – the public need to know which key they have
  • Nuts and bolts – to fix each locker to its neighbour and to the wall, eliminating the chance of them tipping over

I purposely located a bank of lockers in the corridor so that they’ll be in the sightline of visitors to the front hall. Previously the lockers were tucked away and a constant frustration for visitors. Regular readers will know one of my favourite quotes “Address the user need and the business need will be clear”.

Retail will be responsible for collecting this welcome new income stream.

I was chuffed when one of our Visitor Assistants said “I’ve been here over 10 years and never thought I’d see the day we added extra lockers”.

If you remember to address the bullet points above you’ll have a smooth installation. Good luck.

Update

As of Sept 2018 we successfully recouped the costs within six months, produced a 1x return (essentially revenue double the cost), and now expect a 2x return annually.

Results of running a shop in the front hall

Photo of our front hall shop

Just before summer started I wrote about taking the plunge and running an additional small shop in the front hall at Bristol Museum & Art Gallery throughout the summer. Summer is over and so is our shop, for now. This post is a summary of its performance. In short, the shop was worth doing, with the following results:

  • 10% of total sales for the month or £5,300 net sales
  • 939 orders
  • 2% conversion (confirmed these were additional sales, not just sales taken from the main shop)
  • The same ATV (average transaction value) as the main shop, despite selling far fewer products
  • Staffing costs were covered by moving the second retail assistant from the main shop
  • £400 on a whacking huge shop sign (pictured) and a few units to supplement existing available units
  • Answered countless public enquiries

I would have been kicking myself if we hadn’t tried to push the retail needle, and I’m glad we did. We had our most successful month on record and met our income target; the front hall shop helped us over the line in this respect. We started on day 1 of the school summer holidays without much of a plan, other than a hunch that the products should be suitable for kids and tourists. Over the course of the project we chopped and changed the products as the team’s powers of observation dictated.

Staffing

Finding staff at short notice proved a challenge at a few moments, but fortunately the team, including our brilliant casual pool, came to the rescue. I asked the team for feedback throughout the project. Everyone on the front hall shop said they enjoyed the shifts and kept busy in the quiet periods by pricing and preparing products. As expected, they also answered lots of public enquiries and raised awareness of our retail offer. The main shop coped with having one person instead of two, but this made their work full-on and it’s much appreciated. We will use the feedback to see how we can better ease the load in the coming year, especially around dealing with deliveries.

Product selection

We used existing products from our range and started with best sellers with a Bristol theme, which we thought would appeal to tourists. However, the Bristol theme didn’t really push the needle, so we switched to more ‘kids’ and ‘home’ products, which proved successful. I really wanted to try a selection of jewellery, but the hall is used frequently for evening events, so we decided that could wait for a future project.

Positive fringe benefits

We don’t have staff permanently in our front hall anymore, so having staff there was good for public enquiries. The hall is large, and visitors milling around the shop gave the space a nice vibe. Staff received lots of positive comments about the offer, and many visitors said they’d be back for Christmas shopping. Some visitors completely miss the fact we have a shop, so this made sure 100% knew we had an offer.

Next time

I have decided that the front hall shop should come back at high visitor times so October half-term and then from late November until the end of the school holidays. We need to plan the range further in advance and be very mindful that December is peak evening event season so everything must be easily movable. We did indeed push the needle.

Onwards!