Tales from the Crick: The art of the demo

Equipment demonstrations (demos) play an important role in the evaluation of new systems. As well as the excitement of exploring emerging technologies, a well‐organised demo can help guide procurement decisions and support funding applications. However, it is easy to underestimate the substantial effort required both before and following the demo to maximise its potential impact. Here, we discuss how our approach to demos at the Crick Advanced Light Microscopy Science and Technology Platform (CALM‐STP) has evolved over the last few years, emphasising the importance of a documented approach that combines quantitative with qualitative comparisons and engages with your user base in order to build up support for any potential system purchase.

Modern imaging facilities offer a range of microscope systems that complement each other in order to support a diverse user base from many research groups. 1 Each research study requires a customised approach, as there is no one-size-fits-all microscope. For example, prioritising reduced phototoxicity for live cell work means compromising on other aspects of the imaging, such as signal-to-noise ratio and/or resolution (Figure 1). 2,3 Testing new systems through equipment demonstrations (demos) is a useful way to evaluate how a system can be integrated with existing equipment, provide benefits to users and advance science.

WHAT IS A DEMO AND WHO IS IT FOR?
Equipment demos are an important part of horizon scanning: the identification of current and emerging technologies to guide purchase decisions and support funding applications. They can range from small equipment demos, such as a new camera or other component as part of an upgrade to an existing system, up to a demo of a complete system. They provide an opportunity for companies to showcase new technology to potential new customers, and also to gain experience with a diverse range of sample types, providing valuable feedback that can further aid product development.

F I G U R E 1
The tetrahedron of frustration is commonly used to demonstrate the concept that, when optimising acquisition settings for one imaging parameter, the others are compromised. 2 There is no one-size-fits-all microscope: it is important to understand the priorities for each experiment and what compromises are acceptable. In this quaternary plot, 3 imaging experiments ('live' or 'fixed') with different priorities ('resolution' for high resolution, 'gentle' for minimising phototoxicity, or 'general' with no specific priority) are plotted in 3D space. For example, a live cell experiment will need to compromise on resolution and signal-to-noise ratio if it is to be sufficiently gentle, whereas a project prioritising spatial resolution in fixed samples (such as single molecule localisation) may be relatively unconcerned about the other parameters.
Research groups will benefit from trialling and testing new equipment during demos, having the opportunity to explore new ideas, consider different approaches and try advanced techniques. As well as gaining hands-on experience with the systems, they can also gather preliminary data for grant applications.
Core facilities will gain a better understanding of how new equipment might facilitate future experiments while continuing to support ongoing ones, helping them to make informed purchase decisions.

Types of demo
The different types of demos are summarised in Table 1.
A common type of demo is to visit vendor stands at conferences to see the latest products. While this can be appealing to suppliers, as they get a lot of exposure for a single system installation, there are limitations for the scientist in terms of sample use and access to the system. Alternatively, suppliers may suggest a visit to an already installed system at an existing customer's facility or imaging centre. This offers advantages to the supplier in terms of reduced cost, while being confident that the system is already fully functional. The scientist can often arrange for their samples to be sent to the site (or travel with them), enabling evaluation of the system for their specific needs. However, site visits are limited to just a few samples and a couple of research groups, and are subject to the availability of the host institute's system. Site visits are also not necessarily suitable for live, fragile or infectious samples. They can be done remotely, which is convenient and useful if an in-person demo is not a viable option, but this provides limited hands-on experience.
For the most rigorous testing of a system, an on-site demo is the gold standard. As well as extensive testing of the equipment with a variety of real samples, running pilot experiments and getting hands-on experience, the likely level of ongoing support from the supplier can also be evaluated. Building user support for a purchase decision is an important aspect of any equipment demo, and generation of preliminary data is the best way to demonstrate that a system can improve a research group's experiments.

Cost
There are significant costs associated with on-site demos, for both the suppliers and the host institute, that need to be acknowledged. The supplier bears the financial costs of transporting and installing the system for only a brief time, as well as the cost of work hours for engineers, application specialists and sales staff, before considering the costs of the equipment itself. The core facility invests time and effort in preparing and coordinating the demo, as well as in advertising and recruiting research groups (local knowledge of current projects and the requirements of research groups being highly valuable). There is a cost to the host institute for the work hours of support service teams such as logistics, facilities and IT, who may be required to transport the system within the building and connect it to the relevant utilities. In addition, there are costs to the researchers who prepare and provide the samples. It is important to consider all these costs when planning a demo, and to make sure it runs efficiently and effectively so that the overheads can be justified. A poorly organised demo, without suitable samples and interest from researchers, can be a waste of time and resources for everyone involved. While from the supplier's perspective the success of a demo can be measured by the number of new leads and sales, for the scientists and facilities it is important to learn from all demos: if a system is not suitable at this moment, further development may lead to significant improvements, or it may be more suitable for future experiments. We will discuss here some of the demos we have run at the Francis Crick Institute, highlighting what went well but also what lessons we learnt from demos even when they did not lead to a purchase.

ESTABLISHING THE CALM-STP
Over the 8 years since the creation of the Crick Advanced Light Microscopy Science and Technology Platform (CALM-STP) at the Francis Crick Institute, we have conducted over 60 demos (excluding vendor stalls at conferences). These have consisted of a mixture of on-site demos, external site visits and remote sessions, looking at complete systems as well as small equipment to upgrade existing systems.
As the CALM-STP is a self-service core facility, providing training and support to scientists to enable them to use the equipment independently, the systems we offer need to be accessible, with a strong emphasis on usability. Highest priority is given to the equipment we think is best performing in its field at the time, independent of supplier (acknowledging that suppliers have independent development cycles). Additionally, we have an increasing number of systems controlled with open-source software, 4 which enables the integration of small equipment add-ons onto existing systems when appropriate.

Aims of a demo
The funding context will influence the nature of the demo: if a research grant has already been awarded, then the remit of the equipment is clear and the demo can be focussed on a sample or technique, as specified in the grant. In this case, the aim of the demo(s) is to identify a preferred system or final specification. If funding has not been secured, demos can serve to build up a business case and/or generate preliminary data for a grant application.
Maintaining good relationships with suppliers is essential. As the number of demos will always exceed the number of systems purchased, it is important to be open about the intentions of the demo. Having a mindset of ongoing collaboration with suppliers can lead to continued access to new technologies and opportunities for future demos. In most cases, our approach involves arranging demos of similar equipment in groups, to enable comparisons between related systems with similar samples.

Case Study I: Light-sheet microscopy
One of the first series of demos that we ran focussed on light-sheet microscopy, which was an emerging technology at the time, particularly for commercially available systems. 5 Following a review of the facility, light-sheet microscopy had been highlighted as an essential technique for the new CALM-STP. As light-sheet microscopy is a broad field, with different systems more suited to certain samples than others, the aim was to identify the most suitable system for the majority of facility users.
Seminars from suppliers beforehand were useful for gauging interest from research groups and also for giving the facility staff some technical information on how the systems work. We arranged for two systems to run head-to-head, with the plan of imaging the same sample on each; this proved highly impractical, as running simultaneous demos is very difficult and is further complicated by the many different methods of mounting samples on light-sheet microscopes.
After purchasing the first light-sheet microscope, the initial rate of uptake by scientists was slower than expected (Figure 2a). We have learnt that there are a variety of factors to consider when setting expectations for new systems. The slow rate of uptake may have been due to the steep learning curve for mounting samples on the microscope, the significant image processing required, or simply the general inertia involved in introducing a new technique to a facility. However, thanks to the dedication of the system manager, microscope usage increased, logging over 6500 h across 25 different research groups in its first 5 years (Figure 2b).
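The usage trends in Figure 2 are plotted as 3-month moving averages. As a minimal sketch (the monthly figures below are hypothetical, not taken from this article), such a trailing average can be computed as follows:

```python
def moving_average(values, window=3):
    """Trailing moving average: each point averages up to `window` recent months."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1) : i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# Hypothetical monthly usage hours for a new system's first six months
monthly_hours = [10, 20, 45, 60, 90, 110]
smoothed = moving_average(monthly_hours)
print([round(v, 1) for v in smoothed])  # smoother uptake curve for plotting
```

A trailing window is used so that each plotted month only reflects usage already logged, which suits rolling facility reports.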
With new groups joining the institute, the light-sheet needs evolved and the requirement for sample multiplexing (e.g., multiple samples and/or multiple conditions) increased.With funding secured for a new system, we ran another series of demos specifically for light-sheet systems that were compatible with sample multiplexing and live imaging.
As before, companies gave seminars a couple of weeks prior to the arrival of their microscopes, to build up interest in the building and to ensure that sample preparation was the most appropriate for each system. This time, we organised for systems to come to the facility consecutively and, to enable the comparison to be as fair as possible, we aimed to image a range of 3D samples varying in size and light sensitivity on each system. As a result of our experience with the previous light-sheet demos, ease of sample mounting and software usability were guiding factors in the purchase decision. The level of support we had in the lead up to, during and after the demo of the microscope was also important. The uptake of the new system by research groups has been faster than that of the original (Figure 2).

Case Study II: Spinning disk confocal microscopes
Initially, the CALM-STP did not have a spinning disk confocal system. Consequently, a lot of live cell imaging was being done on point scanning confocal microscopes, at the cost of reduced throughput and increased phototoxicity, when it would have been better suited to a spinning disk (or other array scanning) system. To address this, we arranged a series of demos of spinning disk/array scanning confocal systems.
Based on active projects in our user base and the research groups that would benefit from this technology, we were able to define a list of requirements for the new system before arranging the demos. Our priorities for the new system were:
• Optical sectioning and confocal image quality with high contrast
• Focus on live cell imaging, particularly with mammalian tissue culture cells and Drosophila as expected samples
• Increased throughput compared to point scanning confocal microscopes
• FRAP (fluorescence recovery after photobleaching) and photo-stimulation capabilities
Ultimately, the system we selected performed well on every point. The fast rate of adoption by researchers (Figure 3) highlighted the need for this technology, and the high levels of use (>16,000 h over the first 5 years, from 42 different research groups) led to additional spinning disk purchases that have shown an equally high level of use (Figure 3).
However, many systems were similar, if not identical, in terms of hardware and image quality. What ultimately influenced our final decision were the other aspects of the demo that stood out.
One contender that was excluded early in the process was a system that was never functional. While equipment can fail, and a certain level of leeway can be afforded for demo equipment that is often well travelled and abused, ongoing system failure means its performance cannot be evaluated. It is important to emphasise that a demo tests both the system and the support and sales team. In this instance, the supplier was unable to compensate for the lack of system functionality. On the other hand, if multiple systems are technically sufficient, the supplier's ability to coordinate equipment and engineers can be a deciding factor in the purchasing process. Another important aspect is the software integration of all components. When systems scored poorly on software and usability, it was generally because of unfamiliarity with the software, but there was one instance where not all components were fully integrated into the acquisition software. It is worth noting that the system we ultimately purchased ran software that was a new addition to the CALM-STP, so unfamiliarity is not a hindrance, but functionality is crucial.
Finally, demos can make you aware of systems that you may not otherwise consider. One system tested did not fit the remit for the preferred system in this instance, but it did outperform the other systems in specific experimental conditions. We wanted a more adaptable 'workhorse' rather than a 'racehorse' system. However, even if a system does not meet the current criteria, maintaining good relationships with suppliers ensures there remains the possibility of future demos. In this case, we returned to the system about 2 years later, in the context of a different demo series, and ultimately purchased it for the emerging projects.

Case Study III: Super resolution
After a review of the STP, we were advised to improve our super-resolution capabilities. However, as the term 'super-resolution' can describe various techniques capable of imaging below Abbe's diffraction limit, we needed to understand the needs of our user base at that time, as well as anticipate potential future needs as projects evolved and new research groups joined. Presentations from suppliers to introduce the technologies helped to gauge the level of interest from research groups and revealed a dilemma: while some research groups might benefit from a slight improvement in resolution, a specialised single molecule localisation microscope (SMLM) would benefit fewer users initially but offer an advanced technique distinct from our existing systems, thereby facilitating more novel studies.
Our final decision to purchase a dedicated single molecule localisation system was based on the rationale that a single-purpose high-end system would have a higher impact on the few projects that required such a technique, compared to a multifunctional system that might not perform as well. Initially, the system was used as expected, with a couple of projects from different labs. Unfortunately, the system saw reduced usage during the pandemic and has not returned to pre-pandemic levels (Figure 4).
Following the arrival of new research groups interested in imaging very small objects (e.g., vesicles, mitochondria) in live cells, we returned to looking specifically at systems capable of live cell imaging at sub-200 nm resolution. Unlike the previous round of super-resolution demos, we had a clear list of priorities and research groups interested in the new technology. This enabled us to have a good selection of samples to use on each demo and to make informed judgements on image quality and light dosage, as well as on some quantifiable factors such as speed of acquisition (Figure 5).
The system we ultimately selected 7,8 may not have achieved the highest resolution among all those tested, but it offered other benefits such as faster acquisition speed and usable unprocessed raw data. The take-home message from this set of demos was the importance of having a clear list of priorities when organising demos, and of considering the bigger picture of how the system will be used and the likely types of experiments it will facilitate. Ultimately, this led to the purchase of a system that is increasingly heavily used (3500 h and 50 research groups in its first 3 years) (Figure 4).

Small equipment
Evaluating small equipment involves many of the same steps as a full system demo. In our experience, there are two categories of small equipment demo: (1) speculative demos from suppliers who want you to consider their product the next time a full system is purchased (e.g., cameras, light sources) and (2) specific project requests to add functionality to an existing system. The main difference from a full system demo is the need to consider how any potential upgrade will affect the current system: will it integrate with the existing software, and what are the likely conflicts with current users of the system?

PERFORMANCE EVALUATION
The main purpose of equipment demos is to evaluate the ability of equipment to perform planned and/or expected experiments. Although a single quantifiable measure of how good each system is would be ideal, the reality is that all equipment needs to accommodate a variety of different samples and be used by multiple people in different ways. It is impossible to predict how equipment will be used in the future; therefore, any judgements made from equipment demos need to be based on feedback from the research groups who attended, your own qualitative assessment of the data produced, and system usability. Time is limited on any demo system and, in our opinion, testing a broad range of samples is more valuable than attempting to quantify every aspect of performance (the latter may favour systems that image standard test slides very well but underperform on diverse biological samples). Booking and feedback forms are a useful tool for documenting the requirements and opinions of attendees (see supplementary data for example templates). While certain aspects of the equipment can be quantified, such as design and hardware (e.g., field of view, acquisition speed; Figure 6d), rigorous comparisons between different systems in the uncontrolled conditions of a demo are impossible and at risk of bias. In the end, the question is this: is the equipment suitable for the research interests of your user base?
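Of the hardware aspects that can be quantified, acquisition area per unit time (as in Figure 6d) is among the simplest to compare. A minimal sketch, using made-up field-of-view and timing numbers rather than real demo measurements:

```python
# Hypothetical demo measurements: field of view per tile (mm^2) and time per tile (s)
systems = {
    "System A": {"fov_mm2": 0.040, "time_per_tile_s": 1.2},
    "System B": {"fov_mm2": 0.025, "time_per_tile_s": 0.4},
}

def coverage_rate(spec):
    """Area imaged per second, ignoring stage moves and tile overlap."""
    return spec["fov_mm2"] / spec["time_per_tile_s"]

for name, spec in systems.items():
    print(f"{name}: {coverage_rate(spec):.4f} mm^2/s")
```

Even this simple figure only holds at matched resolution and image quality; as noted in the Figure 6 caption, objective and camera choices can make such comparisons unfair.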

Software and usability
As well as evaluating the hardware, it is important to consider the software used to run the equipment. In many cases, the software determines the usability of the system and how steep the learning curve is likely to be for users who are unfamiliar with it. Except for facilities with a single supplier, it is important for facility staff to have a broad range of experience with different acquisition software. At the CALM-STP, we attempt to be supplier agnostic. However, we each have personal preferences and, as with users, these generally correlate with the level of experience we have had with each software package. A key aspect of any demo is to gain experience with different software and to understand what image processing, if any, is required with that system (e.g., image reconstruction, data export for analysis in third-party software).

Service and support
The final insight that can be gained from a demo is an evaluation of the performance of the supplier themselves. Their organisational competency and enthusiasm will likely reflect the level of ongoing support after purchase. It is important to maintain good relations with suppliers, as technology development cycles vary and, if a system does not perform as hoped, a later model might. Similarly, the interests of research groups also evolve, and a system may become of greater relevance to them after a period of time.

Demo debrief
A few weeks after each demo, it is important to hold a review to discuss the data. The supplier or core facility can create a brief overview of highlights, and attendees can feed back any criticisms that arose. It is our experience that, without a demo debrief, the many terabytes of data generated by a demo are never reviewed, whereas a short report is easily accessible and can be returned to in the future. Requesting a quote for the system as demoed (not a best and final offer) at this time is also sensible, to get an idea of the cost of the system.

SYSTEM COMPARISONS
The comparison of multiple systems needs to be done with caution and a consideration of the procurement process.
Ultimately, the aim is to get the best value for money on any purchase. This does not necessarily mean buying the cheapest system, but rather finding the ideal balance between the commercial terms and the quality of the system. Assessing the technical ability of a system and assigning a quality score requires an understanding of the strengths of the system and the needs of the research groups. A successful demo will provide you with enough preliminary data, documentation of interested research groups and potential applications to establish a rationale for why a system is preferred. To help rationalise the comparison process, a system evaluation form is included in the supplementary material. An important point to emphasise is that the quality score is not a quantification of how good a system is per se, but reflects your confidence that a system will be used, will perform sufficiently well to support the experiments of your research groups, and will justify the investment.

Defining a system remit
Understanding the needs of the local user base, the samples that are likely to be used, and current and potential future experimental requirements are the important factors when defining a remit and minimum specifications (the quality element) for the system to be purchased. The performance of a system during a demo can then be evaluated on whether it fits the remit and scored on how well it will perform the task.
Once the quality element has been agreed, comparison between multiple systems requires an evaluation method for assigning a score. Certain technical specifications of the hardware are directly comparable (e.g., field of view at a certain magnification, available channels). However, most properties are variable and depend on the experimental setup and application space. Comparisons need to acknowledge the variability in the experiments for which the system will likely be used. For example, acquisition speed needs to be considered in the context of field of view and image resolution; phototoxicity levels can be affected by exposure times and laser power, which in turn can be affected by pixel sizes and the sensitivity of the detector used. If a system is to be purchased primarily for a single application space, then the remit can be very specific and it may be possible to compare systems directly; however, most system purchases require some level of multifunctionality and so do not allow for comprehensive quantitative comparisons. Following a demo, a record should be kept to document the technical reasoning behind the final decision.
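One way to document this kind of scoring, assuming a simple weighted matrix is acceptable to your procurement process, is to weight each remit criterion and total the per-system scores; the criteria, weights and scores below are illustrative only, not taken from our evaluation forms:

```python
# Hypothetical weights for the agreed remit (higher = more important)
weights = {"image quality": 3, "usability": 2, "speed": 2, "support": 3}

# Hypothetical 1-5 scores distilled from demo feedback forms
scores = {
    "System A": {"image quality": 5, "usability": 3, "speed": 4, "support": 4},
    "System B": {"image quality": 4, "usability": 5, "speed": 3, "support": 5},
}

def weighted_total(system_scores, weights):
    """Sum of score * weight across all remit criteria."""
    return sum(system_scores[criterion] * w for criterion, w in weights.items())

for name, s in scores.items():
    print(name, weighted_total(s, weights))
```

The totals support, rather than replace, the qualitative judgement: they mainly make the reasoning behind a decision auditable after the fact.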

CONCLUSION
Evaluating scientific equipment should always be a part of the procurement process. There are significant costs involved in organising and running a demo but, if done well, there are many benefits for all parties involved. The main lesson we have learnt is not to underestimate the effort required before and after the demo to make it worthwhile (Table 2 and Figure 7). While there are limitations to which features can be quantitatively compared between different systems, qualitative judgements are important for assessing confidence in a system, and utilising booking and feedback forms helps to document them alongside subsequent usage metrics. Heavily used systems are of course important for cost recovery in facilities, whereas underused systems can be seen as a drain on resources and perceived as an expensive burden or 'white elephant'. A slightly more abstract concept, and difficult to quantify, is a system's contribution to scientific advancement: has it enabled experiments that were not previously possible, due to a unique feature and specific application, or did it duplicate the functionality of other systems and increase capacity? The balance between robust multifunctional workhorse systems and those with a specialist application needs to be considered with any purchase, and is therefore relevant when arranging to demo equipment: understand the aims of the demo and how it can inform any purchase decision. There must always be a space for systems with highly specialised and unique capabilities, as they can have a high scientific impact on a project, but is it possible to avoid them becoming underutilised? An equipment demo is one tool for answering this question: it should inform on the level of interest within the user base and on the level of specialist expertise required to manage the system. A judgement then needs to be made on whether there are enough projects to support the purchase.
Another possibility is to utilise, or consider becoming a node for, a shared core facility 9 or international imaging network. 10

Experience to make a decision
How do you decide between the many imaging systems that are available? Obvious limitations include budget and system type, but there are still likely to be many options for any application space. It is the responsibility of experienced core facility staff to provide unbiased guidance, and equipment demos provide an opportunity to gain valuable experience.

The ideal demo
Although at a basic level the purpose of a demo is to evaluate a system for a certain task, this only applies to single purpose systems with a specific remit.For multifunctional systems, a demo provides an opportunity to engage with your user base and challenge the system with many different samples and experimental goals, gaining insights into the strengths of a system as well as current and future requirements of the research groups that may use the system going forward.
The four stages of a demo are: (1) an initial planning and organisation phase, involving discussions with suppliers, recruiting research groups and a pre-demo presentation; (2) setting up and running the demo itself; (3) a debrief and round-up discussion with the supplier and attendees; and (4) comparison with competing systems (Figure 7). It is important to get feedback and to document sample information, experimental requirements and opinions. Documentation that you can easily return to provides confidence in your judgement that a system meets the required specifications. Additional functionality and add-ons can distinguish systems, and the feedback lets you assess the potential impact of such functions on the system's usage. Service, support and running costs can also be assessed. All these factors can then be put into the context of budget and negotiations to get the best deal from the suppliers, while building up a rationale for how you came to any decision.
In our opinion, evaluating potential new systems through demos should be an integral part of the procurement process. Whether deciding which system is most suitable for a core facility or an individual lab, or gathering preliminary data for grant applications, a well-run demo will always be beneficial.

A C K N O W L E D G E M E N T S
This article is dedicated to all of our suppliers, without whom we would not have any equipment to demo. We have kept systems and suppliers anonymous, as we know how much effort everyone has put in, often to no sale. Hopefully any criticisms in this article are taken constructively. We love you all. Thanks to Kurt Anderson and Sonia Spitzer for critical reading of the manuscript and to Barbara Clough for providing samples. The Francis Crick Institute receives its core funding from Cancer Research UK (CC1069), the UK Medical Research Council (CC1069) and the Wellcome Trust (CC1069).

TA B L E 1
Summary of the different types of equipment demo. *Limitations on live and delicate samples.

F I G U R E 2
Comparison of system usage between new light-sheet systems. Although both systems have a similar application space (live cell, gentle imaging), the second system had a faster rate of uptake. (a) Monthly system usage (3-month moving average) during (up to) the first 3 years. *Indicates start of COVID-19 lockdown. (b) Total annual usage during the first 5 years. †Indicates less than 12 months.

F I G U R E 3
High rate of uptake for spinning disk confocal systems. (a) Monthly system usage (3-month moving average) during (up to) the first 3 years. *Indicates start of COVID-19 lockdown. (b) Total annual usage during (up to) the first 5 years. †Indicates less than 12 months.

F I G U R E 4
Super resolution system usage for a single molecule localisation microscope (SMLM) and an image scanning microscope (ISM). Implementation of the single molecule localisation microscope in a core facility environment has been more challenging than the ISM approach. (a) Monthly system usage (3-month moving average) during the first 3 years. *Indicates start of COVID-19 lockdown. (b) Total annual usage during (up to) the first 5 years. †Indicates less than 12 months.

F I G U R E 5
Super resolution demo system comparisons. EEA1-GFP expressing HUVEC cells, fixed with 4% PFA and imaged on two demo systems capable of sub-200 nm resolution. (a) Structured illumination microscope (SIM): (i) overview image in widefield mode; (ii) inset, widefield; (iii) SIM reconstruction. (b) Image scanning microscope (ISM): (i) overview image; (ii) inset, unprocessed data; (iii) deconvolved image using the Richardson-Lucy method, 20 iterations (NIS-Elements). Images are single z planes. Scale bars: 10 and 2 µm in overview and inset images, respectively. (c) Resolution comparisons for these images; estimations were made based on the parameter-free image decorrelation approach using the Image Decorrelation Analysis FIJI plugin. 6 All images were subjected to a median filter (radius = 2) before running the plugin.

F I G U R E 6
There are limitations to the quantitative system comparisons that can be made with an unbiased approach. FluoCells™ Prepared Slide #1 (Invitrogen): BPAE cells with DAPI (cyan), Phalloidin-488 (green) and MitoTracker Red (magenta). (a, b) Single z planes from image stacks acquired on two demo systems, specified for similarly high image resolution and signal-to-noise ratio. (c) Resolution comparisons for these images; estimations for each channel were made using the Image Decorrelation FIJI plugin. 6 All images were subjected to a median filter (radius = 2) before running the plugin; the line shows the mean. (d) Acquisition area and acquisition time are parameters that can be compared quantitatively, although other factors such as image resolution and quality need to be considered. For example, the choice of objective lens and camera can have a significant impact and make fair quantitative comparisons unlikely. Scale bars: 20 and 2 µm in overview and inset images, respectively.

F I G U R E 7
Flowchart for organising a demo.

O R C I D
Matthew J. Renshaw https://orcid.org/0000-0001-5238-9191

R E F E R E N C E S