We summarize the methodology adopted to create a user-centric site for the Gemini observatory and present the strategy for the Gemini Digital Governance, which ensures consistency and sustainability of the observatory’s digital presence.
It is common practice for astronomical observatories to use websites to communicate news to their communities, store and share manuals and documentation, and describe their equipment, staff and governance. But how many of those sites manage to satisfy both their visitors and their editors? How many observatories have a clear strategy for their digital presence? How often is the satisfaction rate of those websites assessed? These are questions that we addressed when the Gemini Observatory recently revamped its website with a more user-centric approach. It was an opportunity to revise how all the content was organized, presented and updated. In this publication, we summarize the methodology that was adopted to create the current site www.gemini.edu. We also present our strategy for the Gemini Digital Governance, which ensures the consistency and sustainability of the observatory’s digital presence.
Astronomy departments and astronomical observatories, like many other science institutions, were among the first organizations to adopt HTML in the 1990s and to establish a presence on the World Wide Web. Three decades later, habits are more or less settled, and observatory websites and their maintenance are rarely an organizational priority.
That does not mean that all observatory websites are static. They regularly undergo design and technology updates, ranging from superficial to very deep. However, these efforts most of the time focus only on the graphical design of the homepage and communication pages. Internal teams are commonly siloed, with content decisions divorced from any strategy and a divergence between scientists and other departments. Common tasks such as updating the homepage may be difficult to perform in these environments, as it may not be obvious who is responsible. This does not necessarily result in catastrophic websites, as many look very nice and some are nonetheless fairly easy to use. But is there anything that could be done to systematically increase the quality and consistency of the service provided by observatory websites to their communities?
At the NSF’s NOIRLab/Gemini Observatory, we opted for a more holistic way to welcome visitors, whether they are a user of the observatory, an instrument builder, a member of a funding agency or an astronomy enthusiast. And our first step in making this happen was to learn more about our audiences by doing user experience research: asking them what they are looking for, how they are looking for it, and observing them performing these tasks. We also implemented digital governance, an observatory-wide framework for establishing accountability, roles, and decision-making authority for our digital products, enabling better maintenance and sustainability.
Here, we present the results of a 6-year project for a complete rework of the Gemini website structure and for the creation of a digital governance framework. We hope the following sections can help or inspire other organizations facing similar challenges regarding their website and digital product management.
In section 2, we briefly describe the Gemini website content management system (CMS) and related scripts. In section 3.1, we detail our motivations to restructure the Gemini website. In section 4, we describe the different tests and assessments that drove the structure and design of the new website. In section 5, we compare the performance of the old and new websites, and discuss how our goals were met. In section 6, we present our digital governance and describe how it supports maintaining the website, guiding and supporting those who edit it, and providing the framework for any digital presence of the observatory. Our final discussion and conclusions are presented in section 7.
The domain of the Gemini website is www.gemini.edu. The Gemini Statement of Purpose, “Exploring The Universe, Sharing its Wonders”, has guided the website’s purpose of providing information to explore and share the wonders of the universe. Since 2014, the annual average number of pageviews has been 700,000.
Its earliest web page was created circa 1994. Since then, more than thirty thousand pages have followed. In 2006, the majority of the webpages were migrated from static HTML into a Content Management System (CMS). The static webpage approval procedure, based on a staging workflow, was ported over into the CMS. The first CMS, used for around two years, was Joomla; it was then replaced by Drupal, which remains the CMS in use to this day.
The management of the website has evolved through three stages:
the librarian creating most content;
each department (science operations, engineering, and public information and outreach) self-managing its own sections of the website; and
Digital Governance, described in section 6, creating strategy, policy, and procedures for defined teams.
The Gemini site addresses a wide range of audiences, described in the sections below.
Helpfully, the Gemini Statement of Purpose, “Exploring The Universe, Sharing its Wonders”, served as an audience identifier. The two largest audiences for the Gemini website are scientists who intend to use the Gemini telescopes and the general public. In addition, and in much smaller proportion, are the various stakeholders. The following is a description of these general audiences.
These are mostly astronomers who want to use the observatory’s services to do their research. Gemini’s main service is providing data. Astronomers can obtain data either from an observing program or through the data archive.
Astronomers who want to get observations will visit the website during the different phases of Gemini’s programs: Phase I (when astronomers explore the capabilities that correspond to their science requirements and write the proposal), Phase II (observing program preparation and execution) and Phase III (data access and data reduction).
Astronomers will also look for policies, tools and support, as well as information on their program, the instruments, the schedule, the weather, etc. They may be interested in ways to send their students to Gemini, in opportunities to visit Gemini, or in job opportunities. They are also interested in the future capabilities of the observatory.
Other scientists visiting the Gemini website are instrumentalists. They want to build an instrument for Gemini, or to modify/improve an existing instrument, or to bring their own instrument to the telescopes. The builders are looking for opportunities (funding, call for instrument proposals, etc.). The owners of potential visiting instruments need contact information and technical specs about the telescope and the instrument support system. Finally, technology enterprises want to know how they can partner with the Observatory.
The general public is, by definition, very broad. It does not self-identify as “general”, and some reflection on who they are is necessary before we tailor our labels and material. We have identified five main classes of the general public.
Teachers and educators: They look for activities to do with the observatory. They also like getting access to astronomical images.
Media: They look for Press Releases or videos, or want to organize a visit on the site. They also like astronomical images.
Astronomers: They are interested in the science made at Gemini. Also, they get announcements (e.g., shutdown, opportunities) and visit social media.
Astronomy enthusiasts: They consume whatever science facts and news they can find, but they will be scared away if everything looks too technical. They also rarely land on the Gemini website by themselves, as they get their science news from other sources. Our challenge is to captivate them when they reach us.
Chilean and Hawai‘i communities: They want to learn about opportunities to see astronomy in action in their community. This includes, among other things, educational events and summit tours.
The observatory board, committee members and funding agencies need a protected area to post meeting information. They also browse the website to “see the Observatory”. They are interested in the science outcomes, ongoing developments and press releases.
In the 2010s, the Gemini website was not very different from any other major observatory website. The front page offered a somewhat unified window to the different parts of the site, such as Science Operation, Outreach and Governance. Each of those sections was under the responsibility of the group it represented, had a different architecture, and they were not all supported by the same installation of the CMS. Anyone with publishing privileges could develop new pages, edits were unevenly moderated, and nobody was in charge of the website as a whole.
Consequently, the website was more like an ad-hoc collection of references with little to no connection between most pages, even within the same section. Worse, the content suffered from a lot of repetition, leading to long texts and many instances where one page contradicted another.
Our objective was thus to move to a more user-centric website that would welcome any visitor in a meaningful way, guide them to the information they care most about, and provide a consistent navigation scheme across the whole site.
The main objective of the project was to restructure the website at all levels (not just the front page), so it better addresses the needs of our audiences and stakeholders. The scope of the project did not include significant development of new content or graphical elements, which were limited to what was needed to reach the main objectives.
The CMS was purposely not upgraded to the latest version, and instead the effort was focused on usability with the presentation layer.
A data-driven approach is a robust way to avoid falling into the trap of developing a website around the preferences of a small group of people. Here, we present the common industry practices that were used to probe the Gemini site audiences and to collect quantitative measurables that could be used as specs for the final website.
Our first step was to identify the most frequently visited pages. Our assumption was that they must be among the most relevant to the website visitors. We made sure that our assessment included those pages, and that their content would be well represented in the new structure.
Usability testing is the evaluation of a product or service by testing it with representative users. The goal is to identify any usability problems, collect qualitative and quantitative data, and determine the participants’ satisfaction with the product or service. Typically, during a test, participants try to complete typical tasks while observers watch, listen and take notes.
To use the full potential of this method, we applied it to both the old and the new site and compared their performance to verify whether our objectives were met. Each test involved 5–6 people from different audiences, i.e., astronomers, students and school teachers.
Each testing session lasted up to one hour and started with a short questionnaire about the tester’s web consumption habits, preferences in terms of web content, and relationship with the Gemini website. Then, the tester was presented with a small sample of pages. For each page, they started with a short description of what they saw and of how they would expect to use the page. Then, the tester had to execute a series of tasks, such as explaining how they would look for information on a given instrument’s filters, the instructions for proposal submission, or the latest press release. A total of 3 Gemini webpages and 2 pages from another major observatory were tested, using between 4 and 6 different tasks. Using pages from other observatories allowed us to compare how Gemini’s pages performed on the same tasks, and to identify good ideas and practices that could be implemented.
Each testing session was recorded (with the tester’s approval) and observed by a small team of Gemini staff located in a separate room. The observers were presented with a view of the tester’s face and another of their screen. They noted down the tester’s reactions and kept track of how the tester behaved on the tested page. The philosophy behind the exercise is that if the tester does not complete a task successfully, it is because the page failed them, and not the other way around. This allowed us to identify the changes required to increase the website’s success rates and improve our visitors’ experience.
At the end of each testing session, the observers spent about 20 minutes sharing their observations and agreeing on the top ten observations that needed the most attention. As an example, the list of observations on the old site was:
The pages were too wordy, while much of the content could be better represented by tables and figures.
The navigation was not consistent, as the navigation bar and blocks were changing from page-to-page.
Information on the staff was hard to find.
There was no clear way to understand for which audience some of the content was intended: some people would land on pages they did not expect and did not know whether they were in the right place.
The homepage menu behavior was unexpected. That was a result of trying to pack too much on it.
There were no standards on what size a title should be, what should be a title, and how to highlight important content. This led to some confusion.
There was too much distraction: the old pages were intended to expose visitors to content from the outreach department, but that content took up too much space on all the pages and confused the navigation.
It was missing some basic functions.
The Job listing was too hard to find.
The calls for telescope proposals and progress reports on active programs were not in the expected place.
Usability tests give a lot of information and are very powerful. But they are also very demanding: they require a lot of organization and significant staff involvement (especially from observers), and depend on the ability to find testers. There are ways to target one aspect of the website and to scale the testing down to a few minutes, making it lighter and more portable.
One of the main challenges identified during the usability testing was the site’s navigation. We used the TreeJack software from OptimalWorkshop for this task. The tool displays a simple menu list, and when the tester clicks on one of the items, it expands the next level of the navigation tree. For each session, the tester has to navigate through the tree to complete ten tasks. All the items of the navigation tree are collapsed at the start of each task, so the tester starts from scratch every time. There are four different grades for each task. The task could be a:
direct success (the tester went straight to the expected link),
indirect success (the tester reached the expected link, but needed to backtrack),
direct fail (the tester ended on an unexpected link) or
indirect fail (the tester wandered around and landed on a semi-random link).
Success rate is measured as the fraction of testers who completed a given task successfully (directly or indirectly), though a direct success is preferred.
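As an illustration, these grades translate into the success metrics as follows (a minimal sketch in Python, using hypothetical grades rather than actual TreeJack output):

```python
from collections import Counter

# Hypothetical grades for one navigation task, one entry per tester,
# following the four categories described above.
grades = [
    "direct_success", "indirect_success", "direct_success",
    "direct_fail", "indirect_fail", "direct_success",
]

counts = Counter(grades)
n = len(grades)

# The overall success rate counts both direct and indirect successes.
success_rate = (counts["direct_success"] + counts["indirect_success"]) / n
# The direct rate isolates successes that required no backtracking.
direct_rate = counts["direct_success"] / n

print(f"success rate: {success_rate:.0%}")  # → 67%
print(f"direct rate: {direct_rate:.0%}")    # → 50%
```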
The test could run on a tablet and took less than 10 minutes. We performed it at conferences, where we could get help from up to 50 people covering most of our audiences (e.g., astronomers, educators, instrument builders). We tested the navigation of both the old site and the new one to compare their performance.
A small number of quick interviews were used to decide on specific questions during the development of the new website. These were simple questionnaires, sent through an online form or asked over the phone to volunteers. They are quick and simple to organize, but need to be very targeted to be useful.
Collecting information from the website audience through interviews and testing is extremely valuable to successfully make a user-centric website. On the other hand, some questions can be better answered by the observatory staff themselves. Moreover, since they will be responsible for the maintenance of the site, it makes sense to make sure they are engaged early with how it will be organized.
An efficient way to engage staff is through topical workshops targeting an issue or a challenge that needs to be solved. We used two different types of workshops: one was a grading game, the other a sorting game.
Websites are never really all good or all bad. There are some good things that we want to keep when they work, and others we want to change as soon as possible. To identify which was which on the old pages, we printed a few of them on a poster, and asked volunteers from the staff to comment on them.
The volunteers were given two piles of post-its of different colors, one green and one red. Everyone was then asked to write down up to three things they liked and three things they did not like about the page. Together, we placed the post-its over the parts they concerned and discussed them. At the end of the workshop, we listed the solutions we would favor to address the dysfunctional parts without affecting what works.
Sorting and labelling content can be quite challenging and subject to individual taste. Yet, it is important to get it right when we structure the navigation tree and sort the website’s content, if we want to reach a high success rate with our visitors.
To decide how to organize the content of a big section (like a subset of pages about an instrument), or which pages go under which main menu items, we used the sorting game. The first step is to write keywords that represent the content on post-its, one keyword per content item. For instance, for an instrument, the keywords can be filters, gratings, resolution, pixel scale, data format, etc. Then, the volunteers group the keywords that go together in as many groups as required, and name the groups. The last step is to discuss each group separately and the relevance of the keywords it contains. If the group figures out how to improve the groups, e.g., merging two, or moving keywords from one group to another, they simply move the post-its and rename the groups as needed.
The sorting game can be played on a table or a drawing board. The board has the advantage that the groups can be traced and modified as required.
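The outcome of a sorting session can be captured as a simple mapping from group names to keywords; the groups and the merge below are hypothetical, for illustration only:

```python
# Hypothetical card-sort result for an instrument's pages: each group
# name maps to the keywords the volunteers placed in it.
groups = {
    "Components": ["filters", "gratings", "detector"],
    "Performance": ["resolution", "pixel scale", "sensitivity"],
    "Data": ["data format", "calibrations"],
}

# The volunteers may decide during the discussion to merge two groups
# under a new name; the post-its simply move together.
groups["Capabilities"] = groups.pop("Components") + groups.pop("Performance")

for name, keywords in groups.items():
    print(f"{name}: {', '.join(keywords)}")
```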
Our data collection strategies provided us with a lot of information about how to move forward in replacing the old site. Still, it is much wiser to create prototypes before hard-coding the new website templates.
We used Figma to mock up the new content organization and page templates. The prototypes were submitted to usability testing to check whether they corresponded to our needs. Extensive tests were not necessary, as a few testers were enough to get a sense of what worked and what did not on the prototypes.
There are many good ways to build a new website that fixes the problems from the old one. Even after the testing and the workshops, we are left with choices that are more a matter of taste than a matter of understanding the data. And that is not necessarily a problem, as long as for the most part, the new website corresponds to what the data teach us, and it performs better than the old one.
In brief, for the Gemini website, we decided to keep a uniform environment across the whole site. There is no strong division between the sections addressed to the different audiences. That does not necessarily please everyone, as many are used to a simple button that brings scientists to a separate division addressed to them. The challenge with that model is that the other audiences do not necessarily identify themselves as “Public”, and most of the time they get a limited experience during their visit to the site.
The top menu bar and the footer of all the pages are always the same. That allows for a consistent navigation scheme across the whole website. It also exposes the visitors to the other parts of the site, yet it does so with a minimal use of real estate. That is especially useful to astronomers who want to use the telescopes, but who also teach classes and are interested in the latest discoveries presented in the press releases.
The labeling of the menu bar was designed to guide the visitors to where they really intended to go. Those items are:
Observing (intended for users of the observatory)
Instrumentation (intended for both users and instrument builders)
About (for everyone)
News (for everyone)
Gallery (images and videos for everyone)
Learn (intended for the so-called general public)
Events (to promote science venues organized by the observatory and/or attended by staff)
Child pages of those main sections are organized based on the expected behavior of the visitors.
Under Observing, most of the content is organized based on the different phases of a Gemini observing program:
Phase I, proposal for telescope time
Phase II, preparation of the observing sequences and active programs information
Phase III, data reduction and user feedback
Additional information that helps in making decisions or getting help during any of those phases is also grouped under Observing.
Under Instrumentation, PIs can find information about all the current instruments. That content can be useful at every phase of an observing program. Other pages, such as retired instruments, future instruments and instrument programs are relevant to both PIs and instrument builders.
That sitemap responds to Gemini’s needs. Another observatory could converge on a different structure. What matters is that the site is built around the data collected when the different audiences of the site were consulted.
In order to make sure that the changes we were making to the site had a positive impact, we regularly used performance comparisons. When quantitative data were not available, we relied on qualitative information, such as comments shared with the observatory in various instances.
Table 1: Comparison of the usability tests performed on the old (2016) site and the prototypes (2018) of the Gemini website. The first column lists the tasks that the testers had to execute. The second and fourth columns show the average time (Time) it took the testers to execute the tasks in 2016 and 2018, respectively. The third and fifth columns show the completion rate (Comp.) of the tasks for 2016 and 2018, respectively. The tasks were:

How would you look for information on how to apply for telescope time?
How would you look for job opportunities at Gemini?
How would you look for information on how to bring a visiting instrument to Gemini?
What are the geographic coordinates of the Gemini telescopes?

Science main page:
How would you look for information on our astronomy staff?
What instrument is installed tonight?
How would you look for information on how to visit the Gemini South facility?
What is the extinction in magnitude when the cloud cover is CC70%?
How would you look for the total time observed of a 2016B program at Gemini North?
Who is the Gemini South Head of Science Operation and how can you contact her/him?
Name all the instruments that could be used for mid-infrared imaging in the 2016B semester.
How do you look for information on how to prepare your Phase II (OT)?
How would you look for information on data reduction?
What filter should be used with the HK grism?
How would you go from this page to information on the AO system?
How would you file a helpdesk ticket from this page?
Unfortunately, we could not afford a final quantitative assessment of the website after the launch, due to resource limitations. Yet we can compare the performance of the original site and an intermediate prototype from 2018, as measured from the recordings of our usability tests (see Table 1). In both tests, we made sure to reuse a large fraction of the questions, allowing us to compare measured values one-to-one. We chose the time to complete a task and the task completion rates as the values to compare. The time to complete a task is defined as the time between when the tester hears the end of the question related to the task and when they either act on the site in a way that would complete the task or explain how they could complete the task. If the tester did not find the information related to the task or ended up in a wrong part of the site, the task was considered incomplete, and no completion time was recorded. The completion rate of a task is the percentage of testers who could complete the task successfully, regardless of the time required. The values presented in Table 1 are the averages of the measures from all the testers for a given usability test.
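These two metrics can be computed from the per-tester recordings as follows (a minimal sketch with hypothetical timings; None marks an incomplete task, for which no time is recorded):

```python
# Hypothetical per-tester results for one task: completion time in
# seconds, or None when the tester failed to complete the task.
times = [42.0, 58.5, None, 35.0, 61.0]

completed = [t for t in times if t is not None]

# Completion rate: fraction of testers who completed the task at all.
completion_rate = len(completed) / len(times)
# The average time is taken over completed attempts only.
avg_time = sum(completed) / len(completed)

print(f"completion rate: {completion_rate:.0%}")  # → 80%
print(f"average time: {avg_time:.1f} s")          # → 49.1 s
```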
The completion rates of the tasks on the old site, as tested in 2016, were in great part fairly satisfactory. More than half of the tasks could be completed by most or all of the testers. There were some shortcomings that we targeted in the development of the new site, yet most users could find what they were looking for by using the site alone, rather than searching for the information on Google or emailing a Gemini staff member. On the other hand, the time required to complete a task was often around a full minute, and few visitors have that much patience when they search for critical information on a website.
The performance of the 2018 prototype was for the most part largely superior. Completion rates rose to 100% for the vast majority of the tasks, with the few remaining cases still reaching a high rate. Most importantly, the completion time decreased significantly, with the simplest tasks taking only a few seconds to complete. The expectation is that Gemini users can get the information they need more efficiently, which should improve their experience of using the observatory for their work.
Since our main motivation for changing the Gemini website was to improve the user experience, we explored other data related to Gemini users in search of signs of the impact of our project.
The Gemini Helpdesk ticket system is designed to collect questions and requests from users and to direct them to the appropriate staff. The ticket categories cover the whole range of the observatory’s activities, from Phase I to Phase III, including instrumentation, operations, etc.
One assumption is that if the website change had a dramatic impact on the way users find critical information, it would change the number of tickets received through the Helpdesk system. However, no such change can be significantly detected. There are many trends in the number of tickets received, and each topic may have its own fluctuations. The total number of tickets has been decreasing since 2012, and the trend observed between 2016 and 2022 remains constant. Consequently, we can say that while the changes to the website did not eliminate the users’ need to ask staff for help, they did not cause a significant increase in that need either.
Our analysis does not include a study of the evolution of the content of the tickets between 2016 and 2022, as they contain private information, and such effort would be outside of the scope of this work.
In 2017, the Gemini Science User Support Department began a direct dialog between the Observatory and its users by sending out routine Short Surveys (2–3 questions) at every critical phase of Gemini’s user programs. The compilation of the responses provides rich feedback that guides decisions about how to improve operations and increase the efficiency of the systems. Every survey contains an open box for comments. Respondents are encouraged to express their opinion on anything related to their experience, and the website is explicitly mentioned as one possible topic in the question form.
Interestingly, before the launch of the new Gemini website, it was rarely mentioned by survey respondents. In the first 4 years of the surveys, it was mentioned 34 times, including only one positive comment, 3 neutral comments, and 30 negative comments! In comparison, between the launch of the site in April 2020 and the writing of this publication (spanning over 2 years), a total of 40 comments were collected. The fractions of positive, neutral and negative comments are more balanced, with 8, 12 and 17 comments, respectively.
The launch of the new site triggered a significant increase in mentions of it in the survey respondents’ comments. Encouragingly, the increase came mostly from positive and neutral comments. Many of the neutral and negative comments provide useful suggestions for further improvements, mostly focused on the organization of the content on the Instruments pages and on the pages describing the procedures to apply for telescope time and to prepare observing sequences.
The new website was well received by many of the different Gemini governance committees. The Gemini board and the board of directors have expressed their satisfaction with the improvements made to the website. Yet, more eloquent was the comment from the Users’ Committee for Gemini Observatory (UCG). The UCG provides feedback to the Observatory on all areas of its operations that affect current users of the facility, based on the experience of the committee members as well as input collected from the larger community of Gemini users. In their 2021 report, they wrote:
The UCG thinks that the new website is well developed, and it is now much easier to find the information that one wants. In particular, we appreciate that all information relevant to an instrument is grouped together. The UCG members thank Gemini for the nice transition to the new Gemini website.
That testimony, from a committee composed exclusively of Gemini users, is a good indicator that the work went in the right direction.
The launch of the new website structure in April 2020 was clearly a major milestone. Yet, it was not sufficient to guarantee a sustainable site that adequately adapts to changes and resists any uncontrolled drift back towards the issues that were present before the restructuring project.
That is why something else, just as important as the improved website, needed to be established: an observatory-wide strategy that guides the work on digital products, such as the website, but also mobile apps, social channels, and any other Internet and Web-enabled products and services. This strategy is supported by the Digital Governance (DG).
DG is a framework for establishing accountability, roles, and decision-making authority for an organization’s digital presence. It is about knowing who creates the set of policies and standards that guide the work of the digital team.
DG mitigates risks such as misalignment of the observatory’s strategy to digital products, uncertainty of authority and roles for digital products, lack of policies to identify digital risks, and lack of standards to ensure quality for digital products.
The Gemini Digital Governance framework is based upon the book “Managing Chaos: Digital Governance by Design”, written by Lisa Welchman.
Two DG teams were implemented at Gemini. The Core Team (CT) makes, maintains and enforces the policies and standards. It is composed of members with decision authority from all the groups represented in the digital products. It meets every month to work on the policies and standards, make required changes and maintain the observatory’s strategy. There is also the Community of Practice (CoP), managed by the CT, which is a forum open to everybody at the observatory. It meets quarterly and covers three topics:
Requests from staff and issues with digital products, procedures, etc.,
Showcase of new techniques, procedures, social media, applications, and
Updates from the CT.
There are volunteers from each of the groups represented in the production of digital products. Their role is to participate in all the meetings, raise awareness of the CoP and DG updates within their group, and encourage their peers to bring up questions or requests.
Digital policy manages risk and ensures the observatory’s core interests are served as it operates online. Policy documents are usually short. They present:
the purpose of the policy: a brief sentence telling why the policy was necessary;
an overview of the document’s content: a one to two sentence summary;
the scope of the policy: what the policy applies to;
a list of related documents: other policies, standards, guidelines, etc.;
definitions of the terms introduced for the policy;
the details of the policy: a list written in an assertive style, whose first item usually enforces compliance with a related standard document;
responsibilities: a detailed description of who is responsible for what; and
enforcement: the consequences of a failure to comply with the policy.
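The required sections above can be modeled as a simple template. The sketch below is purely illustrative (the class and field names are hypothetical, not the actual Gemini template), showing how a policy document's completeness could be checked programmatically:

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical sketch of a digital policy document, following the
# structure listed above. Names are illustrative, not Gemini's actual
# template.
@dataclass
class PolicyDocument:
    purpose: str                    # why the policy was necessary
    overview: str                   # one- to two-sentence summary
    scope: str                      # what the policy applies to
    related_documents: List[str]    # other policies, standards, guidelines
    definitions: Dict[str, str]     # terms introduced for the policy
    details: List[str]              # assertive statements; the first usually
                                    # enforces compliance with a standard
    responsibilities: Dict[str, str]  # who is responsible for what
    enforcement: str                # consequences of non-compliance

    def is_complete(self) -> bool:
        """Check that every mandatory section is non-empty."""
        return all([self.purpose, self.overview, self.scope,
                    self.details, self.responsibilities, self.enforcement])
```

Such a structure makes it easy to audit a set of policy documents for missing sections before they are submitted to the CT for approval.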
Digital standards are used to ensure optimal digital quality and effectiveness. A digital policy establishes the purpose and requirements of a digital standard. Standards must have an associated “parent” policy. Digital standards contain the following information:
the scope: what the standard applies to;
a list of related documents: parent policy, related policies, standards, etc.;
a description of the roles and responsibilities; and
the details of the standards.
It is important to limit the text to what must be done, not how. For example, a standard should not say that the Drupal CMS must be used; it should instead describe all the capabilities required. The choice of CMS is then guided by the standard, and the tool can be changed without having to adjust the standard document.
Every document must contain the rationale and context for the document; an identification of a custodian and authorship; the date of original publication and date of next review; and a change log of modifications or review.
For maximum efficiency, detailed procedure documents are produced following the policies and standards. They describe step by step what should be done to achieve a given task, and should read like a user’s manual. They ensure systematic compliance with the digital policies and standards.
Modifications of the policy documents have to be approved by the CT and the observatory’s directorate. Modifications of the standard documents have to be approved by the CT. Modifications of the procedure documents can be made by anyone, and are better supported when they are done as a CoP exercise.
The standard for website maintenance applies to the creation and modification of pages and content on the Gemini public website (www.gemini.edu). It defines the following roles:
Author: someone who is creating or modifying content;
Page owner: someone who is responsible for a given web page and its content;
Moderator: someone who reviews new pages and new content change requests for compliance with standards before they are published; and
Visitor: someone visiting the Gemini public website.
Governance: Every page must have an owner. The standard document describes the assignment of owners. There are at least two moderators per website section.
Content management: The content of the whole Gemini website must be actively managed. The standards document describes what must be contained in the procedure document and the reporting system (for broken links, misspellings, etc.).
Page creation and editing: The standards document describes the capabilities that must be offered to the author and the responsibilities of the author (what they must do).
Moderation and publication: Any new content must be moderated. The standards document describes how much time a moderator has to review a publication request and who must be notified of the moderating decision. A page owner can request the moderator to revert to a previous version of a page they own.
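The moderation workflow just described can be sketched as a small state machine. The state names and the review deadline below are assumptions for illustration; the actual values are set in the Gemini standards document:

```python
from datetime import datetime, timedelta

# Assumed review deadline, not the actual value from the Gemini standard.
REVIEW_DEADLINE = timedelta(days=3)

class ChangeRequest:
    """A content change submitted by an author for moderation
    (hypothetical sketch of the workflow, not Gemini's implementation)."""

    def __init__(self, author, page, submitted):
        self.author = author
        self.page = page
        self.submitted = submitted
        self.state = "pending"          # pending -> published | rejected

    def moderate(self, approve):
        """Record the moderator's decision; in the real workflow the
        author and the page owner are then notified of the outcome."""
        if self.state != "pending":
            raise ValueError("request already moderated")
        self.state = "published" if approve else "rejected"

    def overdue(self, now):
        """True if the review deadline has passed without a decision."""
        return self.state == "pending" and now - self.submitted > REVIEW_DEADLINE
```

Tracking requests this way makes it straightforward to report on moderation backlogs and to enforce the review deadline defined in the standard.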
Content review: A report presenting a quality assessment of the pages they own must be sent periodically to page owners. Reports are sent at a frequency matched to when the content needs revision. For instrument pages, the report is sent a month before a call for proposals, ensuring that the content will be up to date when that section of the site gets its highest traffic. For pages with basic information, the report is sent yearly, to avoid content staleness.
Most of the heavy lifting in the work described in this article was done by two people. This shows that, technically, the key to success is not dedicating a large number of staff to such a project. It is rather two other things: the observatory’s commitment to mobilize additional staff for targeted efforts (a few hours at a time) when required (about a dozen times during the project), and its commitment to establish a completely new way to govern its digital presence. It is essential to accept decentralizing responsibility for the digital products, so that it is distributed over the DG teams rather than resting in the hands of one or a few individuals.
Establishing a new website structure, enforcing new standards on the publication of content and running a new DG requires a lot of patience. It took Gemini about two years to reach cruising speed. A large number of staff needed to understand the purpose of the changes, be trained in the new ways, and contribute to the DG teams in a constructive way. We are now at a state where most of the fundamental policy and standard documents are written, the website is routinely maintained, and changes happen in a coherent way.
The gamble is that the benefits are greater than the cost. The cost is hard to evaluate. It can be seen as small, since the project team was tiny. But almost everybody in the observatory was involved at some point, whether because they momentarily helped or simply because they had to be trained, and the time spent adapting to the new tools and procedures is hard to assess. Yet, we can already see a positive return on the investment through different performance indicators, as presented in this work.
As it turns out, the Gemini website will have to go through a major change fairly soon. This is the result of the recent unification of all the US ground-based optical and infrared observatories under the umbrella of NSF’s NOIRLab. The timing was such that the restructured website was launched the same year NOIRLab was created, changing many of the conditions that dictate how the content should be presented. The most important change required is to remove the so-called outreach part of the site and unify it under the new domain www.noirlab.edu. A future change may also involve migrating or redesigning the so-called science part of the site. Yet, this does not mean that all those efforts were a waste of time. On the contrary! Restructuring the site was a great opportunity to regain control of the website’s content. We know how many pages there are (a number ten times smaller than for the older website), we know where the critical information is, and we know how to present it. We know our audience better, and we have improved our dialog with them. Finally, now that we have a DG, we have a suite of documents that describe in detail what we need to preserve, what specifications any new CMS or editing tool should have, and how visitors should be guided to the relevant information. In other words, we are in an ideal situation, ready for when sites need to be merged and/or unified.