What technology do you use at work? How well does it work for you? And what tech are you hoping and praying will land on your lab bench or computer one day?
Technology scouting is the process of identifying gaps in a business that can be filled with new tech. Maybe it’s the manual copying-and-pasting of data that could be automated. Maybe it’s a cool experiment that could be done with a new instrument. Whatever the gap, finding a good solution requires a thought-through approach. How should you approach tech scouting to get the best outcome for your work and your organization?
Join us in Episode 2 to learn how to evaluate your technology toolbox.
Evaluating Your Technology Toolbox Transcript
Andrew Anderson 00:00
I mean, in a lot of ways, it’s the classic “I’m too busy chopping wood to sharpen my axe.” Thinking about your process. You know, that’s work. It’s work to self-reflect. And if I’m busy chopping wood, it’s difficult for me to look at my axe.
Charis Lam 00:26
How much software would you say you use in your job?
Jesse Harris 00:28
I’d say a lot personally, a good deal of it. I know that there’s some software that I can’t live without. And then there’s some that I barely tolerate to get my job done.
Charis Lam 00:41
Same, and the question is, where does this come from? And what determines what tools you’re using and whether they work for you? And it comes from a process called software evaluation or technology scouting.
Jesse Harris 00:53
Welcome to The Analytical Wavelength, a podcast about chemical knowledge and data in the pharmaceutical, agrochemical, and related industries. Today, we’re going to be talking about technology scouting—figuring out how you might be more productive with the right tools and determining what the right option is.
Charis Lam 01:10
We interviewed Andrew Anderson, Vice President of Innovation and Informatics Strategy at ACD/Labs. Back in December, he wrote a blog post on software evaluation for the busy scientist. And he’s here to elaborate on some of those ideas.
Jesse Harris 01:24
We caught him at his home in California.
Today on the podcast, we have Andrew Anderson. I understand this is your third stint at ACD/Labs, Andrew.
Andrew Anderson 01:34
That’s correct. Yes.
Jesse Harris 01:36
And in between those, you’ve had a few stints at Pepsi, Symyx, and Pfizer. Is that accurate?
Andrew Anderson 01:42
Yes, that’s correct.
Jesse Harris 01:43
Yes, a nice variety there, which is lovely. How are you doing today?
Andrew Anderson 01:47
I’m great. Great. Looking forward to the discussion.
Jesse Harris 01:50
Good. We’re looking forward to having you. And one icebreaker question. We actually wanted to ask you: what is your favorite chemical or like a chemical that has a special place in your heart?
Andrew Anderson 02:00
Oh, man, I’d pick two. One is caffeine, certainly. I work an East Coast schedule. So it’s early mornings for me that are often caffeine-fueled. And the second is beta-alanine of all things. I enjoy being an amateur competitive weightlifter. And beta-alanine helps me get through. After those long East Coast days, I train from about 5:30 local time to about 7:30 local time, five days a week. So I need some fuel, and beta-alanine seems to help me get through my workout. So those are my two favorites right now.
Jesse Harris 02:40
Great. Caffeine is a pretty mainstream one. But that’s a lovely list.
Charis Lam 02:47
Besides weightlifting, I understand that one of your interests is technology scouting, right, Andrew?
Andrew Anderson 02:53
Absolutely. Yeah, a topic near and dear to my heart.
Charis Lam 02:57
And that’s definitely something that we want to discuss today. So maybe let’s try a hypothetical scenario. Say that the three of us are in a company’s analytical-science division, and we need new software. Well actually, let’s take a step back from that question. How do we as a group know that we have any sort of need at all?
Andrew Anderson 03:19
It’s a great question. When I think of organizations, I think of systems. What I first try to look at is “what’s the overall goal of the system,” and then look at how that goal is influenced by my activities, whatever they are, right? So if I’m part of an organization, I certainly contribute to helping the organization reach its goal, right?
So in a traditional … put myself into my former shoes as an analytical chemist, what am I doing that contributes to the organization meeting its overall goal? So as an example, if I am an analytical scientist, I’m usually performing some sort of characterization of materials, substances, things that are being produced by the organization’s function, whatever it is. It can be in manufacturing; it can be in research; it can be in development. But my set of skills are truly the eyes on both substances and processes.
So when I look at the organization’s goals, I usually have two goals, right? I want to do something new, I want to produce something new. I want to make something that I don’t make today. Think of that as the innovation part of R&D, for example. But then also, I want to look at my existing set of business processes and presumably do them more productively. Faster, cheaper, higher quality, reduced risk, etc. Those are all facets of productivity. And so when I drill down into whatever the organization’s business processes are, there’s usually some sort of aspirational goal for improving them. Either support for new business processes, or making the existing ones more efficient.
I want to be able to look at my set of tasks that help with those business processes, and either do things differently, right, new things in the lab, or do what I do currently more productively. The question, from my perspective, is: what are my tasks beyond just my traditional experiments? Think of them as non-functional tasks, whatever they are. The classic example in the analytical lab is: I’ll certainly spend time performing analytical experiments, but what am I doing outside of experiments? How much time am I spending in project meetings? How much time am I spending preparing human-readable summaries of my experimental results to proliferate through the organization and help facilitate effective decision-making, as an example?
So looking at not just my experimental tasks, but all of my tasks, and looking for ways to replace them with less human-intensive efforts, is an important aspect of, first off, technology assessment. I always like to say to folks: you diagnose a challenge or a problem before you prescribe a solution. And so scouting, in a way, is dependent on good technology assessment. What are my needs? Before I start looking for something, I want to really quantify the size of the issue and the factors which contribute to that size. Then, and only then, am I able to start looking for the explicit set of capabilities in some technology that I can scout for, find an effective partner to help implement that technology, and ultimately oversee its implementation and value realization. Does that make sense?
Charis Lam 07:20
Yeah, that makes sense. And you had some really good examples from an analytical scientist’s perspective. So would you say that it’s everybody’s job to be thinking about technology scouting in relation to their own work tasks?
Andrew Anderson 07:33
I would say the short answer is yes, absolutely. But it depends on the role of any individual in an organization or institution. As we say in sports, sometimes you pick your shots. Pick the time when it’s appropriate to change a process. The analogy being: you don’t necessarily want to be fixing your fuselage as you’re flying the plane. It’s important to know when change can be embraced.
One of the challenges that I saw, and continue to see in my career, is determining the right time that change will be embraced in an organization. So look at the spectrum of tasks associated with a change, right? In my past life as a sales and business-development person, I learned that technology can at times be disruptive to processes. There are lots of entrenched stakeholders invested in those processes. Especially in highly regulated industries, like the pharmaceutical industry, you often have very efficient systems, but making changes to those systems can be significantly disruptive and requires a lot of appropriate change control. And so as a consequence, picking the right time and the right circumstance to embrace innovation, to embrace change, albeit disruptive, is a really important facet of any technology scouting, licensing, or assessment work that goes on in various industries.
Charis Lam 09:26
And once we’ve identified a need, and we think it’s the right time for change, how do we know what sort of solutions we should be looking for? How do we know if it’s software or if we just need a new hire or we need to upgrade an instrument?
Andrew Anderson 09:41
Yeah, I think it starts with the needs assessment, which for me is a really important investment to make. In my career, what I found was that the best technology projects I’ve been involved in started with a significant investment in needs assessment and process mapping. Getting a clear understanding of the system’s needs around productivity and innovation. Start at the highest level and work down to individual stakeholders in the network. Start with that, and then look at getting to the fundamental issue within a particular need area.
And I’ll think of an example. We come across opportunities in my daily life where folks want to take data off of analytical instruments that are in various dispositions and formats, and want to facilitate use of the data that’s being generated. And oftentimes, a lot of our stakeholders will want to frame the problem as “we have a reporting problem,” right? We have the need to take data that’s generated, like files that come off of instruments, and prepare human-readable documents that can be stored and managed in a document-management system. At the basic level, that is truly a need, right? We need the ability to use this data more effectively. And under the current technological paradigm, the so-called document-driven decision-making paradigm, the basic need is: take data, prepare a PDF, store the PDF somewhere. Okay.
But as we all across the world, across industries, want to embrace so-called digital transformation, have a data-driven decision-making paradigm, the need changes a little bit. And so if you look at then, beyond the document, what if there are easily accessible software interfaces to review that data that transcend the various data sources, the sources where data is being generated? The software systems that manage acquisition? How do you take that data, decouple it from the source, create data-accessibility points that are easy to access? It kind of replaces the need to even prepare reports. So looking beyond the need with the current technological paradigm in mind, right, go beyond that, and then redefine your needs based on a new technological paradigm.
I think it’s paramount, because then it gets into, okay, do we need people? Do we need reports? Or do we need a system that really converges between software, instrument sources, humans that interact with them, and ultimately machines that interact with those systems? Machines being like computational systems, artificial-intelligence systems, machine-learning systems? You know, how does data free itself? Or using software systems to free decoupled data from those instrument sources that are oftentimes in proprietary formats? It really gets you to identifying the core need of the organization beyond the tactics of, you know, in our example, multi-technique reports being generated and stored in a document-management system. Does that make sense?
Charis Lam 13:35
Yeah, definitely. It’s good advice to decouple the true need from the tactics.
Jesse Harris 13:42
That transitions very well into some of our next questions about identifying and managing these gaps that you’re trying to fill. You mentioned earlier about being precise in the needs that you’re trying to fill. How do you quantify the cost, or make the cost-benefit assessment that says, okay, this is a gap that needs to be filled?
Andrew Anderson 14:04
Yeah. So, I’m a big proponent of business-process mapping. And looking at, across all of … You remember, we go from organizational goals to a collection of business processes that help us reach those goals. We move through a process to arrive at a goal. And in an area that I’m personally fairly familiar with … let’s pick drug discovery. Let’s play a little scenario out. If you look at all of the tasks that are involved, the processes that are involved in drug discovery, from target selection and validation, assay development, screening for lead-series identification, lead-series optimization over time, through the preclinical development process, to effectively validate that your candidate series offers the appropriate potency, selectivity, safety, and bioavailability. All of those different processes coalesce into a candidate nomination at the end.
So what I typically recommend is: itemize first your business processes, then define what are the steps involved in that process. And presumably, a lot of them are iterative, right? Especially lead optimization. You go through a round of … call it molecular architecture, to optimize molecules for various properties. You test whether your proposed and implemented changes have a modulating effect on a particular property. And based on a crop of data that emanates from testing, continue to optimize using some degree of structure-to-activity relationship.
So, getting to a set of, call them unit operations, even if they’re iterative, is the first step after business-process itemization. From there, what I like to look at is: who are the actors involved in that process? What are their tasks? How long does it take to complete their tasks, and what are the system dependencies required to address those tasks? Systems can be software systems, hardware systems, equipment, consumable resources, material resources, etc. So if you get through the system accounting, the business-process accounting, and the tasks involved to support both, you can arrive at both time and material costs.
And then you look at, okay, now, if we want to implement a technology that addresses either the number of steps, the actors involved, the systems being used, you’re able to look at then a comparative change, either in time or consumable usage, or systems that have to be supported. And so that can ultimately arrive at your decision-making indicators.
For example, if a technology implementation costs a certain amount of money and time, you can compare that cost to the impact on business process and the value that that impact creates, and thus, provide a financial metric, like return on investment, or a payback period, or net present value that allows you to make decisions. And if you take that methodology to all of your business processes and all of your prospective changes to business process, you’re able to get almost like a rank order comparator for any technology investment you intend to make.
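The financial metrics Andrew mentions here can be sketched in a few lines of Python. This is a minimal illustration only; the figures, discount rate, and function names below are invented for the example and do not come from the episode.

```python
# Hypothetical figures for illustration; none of these numbers come
# from the episode.

def roi(total_benefit, total_cost):
    """Return on investment as a fraction of cost."""
    return (total_benefit - total_cost) / total_cost

def payback_period(total_cost, annual_benefit):
    """Years until cumulative benefit covers the up-front cost."""
    return total_cost / annual_benefit

def npv(rate, cashflows):
    """Net present value; cashflows[0] is year 0 (the up-front cost,
    entered as a negative number)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Suppose process mapping says a software rollout costs $120k up front
# and saves $50k/year of analyst time for four years.
cost, yearly_saving, years = 120_000, 50_000, 4

print(payback_period(cost, yearly_saving))                      # 2.4 years
print(round(npv(0.08, [-cost] + [yearly_saving] * years), 2))   # 45606.34
```

Running the same calculation for every candidate project gives the comparable numbers needed for the rank ordering Andrew describes next.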
I spent a lot of time in my prior life at PepsiCo looking at that portfolio, right? Of high-level strategic change imperatives or growth imperatives that an organization might have, and having traceability to specific technology projects and investments that we may make. So at the end of the year, as you’re working on things like annual operating plans and the like, you can reflect across your portfolio: what has been the value of technology implementation? Was there a return on investment? And then plan for the new year. What do we want to do next? And really rank order, based on the bang for the buck, so to speak, what our investments will be on a periodic basis.
Jesse Harris 18:47
That also brings something up about: where you start in this process really matters, how the business is structured. And if you don’t have that nitty-gritty understanding of what everybody’s tasks are and how they’re implemented, then you kind of can’t even start with the tech-evaluation thing. Especially because I can imagine companies that don’t have that structure in place, they’re trying to implement technical solutions to problems that are not really the problems that they’re facing.
Andrew Anderson 19:14
Yeah, I mean, in a lot of ways, it’s the classic “I’m too busy chopping wood to sharpen my axe.” Thinking about your process, reporting on your process, counting the so-called time and motion studies that those of us that went to business school learned about, and, you know, that’s work. It’s work to self-reflect. And if I’m busy chopping wood, it’s difficult for me to look at my axe and even say, “Hey, do I need to sharpen my axe?” Taking a break is cost in a lot of ways. So I get it, right. It’s difficult. It’s really difficult to change in organizations because we all want to be productive. Right?
And so, one of the things I learned about as a leader here at ACD/Labs is: we want to give our really smart people time for self-reflection. Give them the tools and the encouragement to think beyond what they do today. Do they have ideas for doing things more efficiently? To explore.
And I’ll go back to my time in the lab for a minute. Fresh out of a bachelor’s degree as a lab assistant supporting an NMR group at Pfizer. I remember some of the more seasoned colleagues of mine would talk about so-called Friday-afternoon experiments. Those were my favorite times. If I reflect on my time in the lab, some of my best ideas came from times where I purposefully built in time to not have to run samples, to not have to prepare reports, to not have to conduct certain experiments. I’d conduct experiments that I thought were exploratory in nature, the so-called Friday-afternoon experiments, and some of my best scientific work came from ideas uncovered during Friday-afternoon experiments.
So we as leaders need to certainly carve out time for our organizational stakeholders to self-reflect and to think about the processes they execute as individuals, as teams, as departments, and as core facilities and functions. It’s difficult, right, with the highly efficient processes that are in place now, especially in regulated industries. But I think the payoff from instituting time for self-reflection warrants the investment.
Charis Lam 21:52
Yeah, that’s a really good point, Andrew, about taking time out of today’s efficiency to improve tomorrow’s efficiency. And once we have sort of done that self-reflection, identified that need, where do we then turn to search for options for filling that gap? Where do we hear about software that we can use?
Andrew Anderson 22:16
That is a great question. So the good news is, when I had a formal technology-scouting goal, looking to establish a landscape in a particular technological area, there are certainly purpose-built tools for that type of landscaping effort. One thing that I always like to say to folks is: make sure you try to stand on the shoulders of giants, right? Do not go into an area that you don’t have familiarity with and attempt to find solutions, because we all have our own set of preconceptions and biases, the so-called confirmation bias that we might bring to a problem, a gap analysis, or a search for technology.
So my strategy when first looking to explore a technological area, to find a landscape of technologies that meet a certain need, is to find a co-conspirator, a human that is familiar with the area, a so-called subject-matter expert. Those subject-matter experts can be inside or outside your organization. And even extend that subject-matter ecosystem to the needs assessment, right? I may not know what all of my organizational needs are. So you rely on internal subject-matter expertise to get more depth into an area, you know, to get really explicit needs. And then from there, when I’m looking at technology, get subject-matter expertise to help you find it. Yes, there are purpose-built systems for technology evaluation and business-process mapping, absolutely. But it’s also good to rely on those giants, so to speak, to find the right things.
I’d even extend it a little bit further, in that you may want to consider subject-matter experts that are outside of a particular discipline. I’ll pick an example. Cross-fertilization is a really interesting facet of technology assessment and technology scouting. A lot of really interesting technology implementations and innovation in general is where you take a technology or a scientific skill set that is relegated to one discipline, to one industry, to one area, and reposition it. Lift it and adapt it out of that discipline or area and reapply it in a new area.
An example of that from my old life is taking molecules that have a certain application or utility in one area and saying, “Hey, can we redeploy this in another area?” For example, things like adhesives and coatings, are there food-grade alternatives that are edible and safe for ingestion that can be used in food applications? So if I have things like moisture barriers, as an example, are there food-grade films, or adhesives, or coatings, that could be deployed in a multi-textured snack product, that would ensure that you have a crispy outer shell with a soft and chewy inner shell, and have a barrier in between the two layers that allows for shelf-stable products?
Charis Lam 25:57
That’s a great example. Makes me hungry.
Jesse Harris 26:03
So you mentioned about talking to subject-matter experts, which is an important element in this. But also you want to be talking to the users themselves, right, in terms of evaluating what technologies make sense for them, because they’re the experts in what they do. So how do you think about stakeholder engagement?
Andrew Anderson 26:21
Great question. Great question. As part of planning, what I like to look at is, first, define stakeholders and their roles. Sort of purveyors of technology, right? Users are one class of stakeholders. The simplest model I have in my head is: I’ve got users of a technology, I’ve got folks that influence those users somehow. And then beneficiaries, if there’s users and influencers that are using a particular technology upon implementation, who benefits downstream from their use?
Indeed, in addition to understanding the explicit functional requirements of a technology, whatever it is, you also have to think about the so-called non-functional requirements. And a layer above that is user experience. There’s a large and vibrant and passionate industry supporting so-called experience design. What we as scientists often do is rush to define explicit functional requirements of a system, a technology. And we often, admittedly, overlook the experience of using that technology, and the corresponding emotional, behavioral, and psychological factors which ultimately influence the successful implementation of a technology.
And those of us in the software world know, explicitly, if you don’t factor in user experience, it can lead to very complex interfaces that have a lot of functionality. But getting to using them can be a somewhat painful experience. So as a consequence, two things that I often recommend is: detail a so-called journey map with users, where as part of that journey map, you define the ideal experience. Not just in terms of what buttons am I clicking, what functions am I exposing to an interface? But how, and what is ideal? And what is the scenario where I will be using the system? Right?
One example is: as we build our applications, we traditionally think of scientists as being in the lab, next to their fume hood or next to their, in my case, NMR. Right? We know that those days are gone. You see a lot more work that goes on very far away from the labs. And I think that that type of work is going to change. People are going to reflect on their life before the pandemic, compare it to their life now, and realize there are certain activities that they perform that can be performed now, with the right digital infrastructure, performed away from the lab. So I think the future of work will be different than what it was before the pandemic, even when social-mobility restrictions are lifted in the hopefully short term.
So yeah, in general, looking at user experience, again, not just now, right? Don’t look at just functional requirements. Look at user experience. Think about the emotion and behavior and the physical environment that you’ll be doing this work, not just now, but in the future.
Jesse Harris 29:59
Yeah, that’s definitely the case. And I think that we can all think of examples of software that hasn’t met our needs in that respect, in one way or another. But definitely for stakeholders, though, bringing them into the decision-making process has got to be a bit of a balancing act, too, in terms of making sure that you include all the people who are important but not overburdening it so that it’s a “too many cooks in the kitchen” kind of experience. Do you have any ideas about that? And how do you manage it so that it both feels as if you’re getting good information, and not just a bunch of noise at the front end, but then also people actually feel like they had a stake in this and accept it when it actually gets implemented?
Andrew Anderson 30:37
Yeah, so I do. And we’ve certainly learned a lot in my tenure here in the last five years about carefully balancing needs, opinions, statements as a function of a role, and a function of priority. So, the so-called traceability matrix. And we talked about stakeholders or purveyors in technology adoption, classifying their roles as users, influencers, or beneficiaries. And looking at needs of each class. Start there.
So to use the cooking analogy, in the culinary world, if you’ve got a big kitchen, you’re going to have a lot of cooks in it. But each of those cooks has different roles and functions. And so you can have a very busy kitchen, so long as the roles and tasks are assigned in an appropriate fashion, and your resources are effectively leveraged in some cohesive and synergistic way. Being that I worked at a food and beverage company, we interacted with a fair number of culinary environments, and the best chefs in the world have a very large staff operating with almost military precision in the kitchen. Right? But their roles are well defined. And their tasks are well defined in accordance with their role. They don’t go beyond that.
So you can have lots of opinions, as long as those opinions are contextualized and classified in accordance with the role. It’s not a bad thing to have a lot of opinions, so long as each opinion represents a particular classification of function. Does that make sense? All this talk about culinary, I’m starving.
Jesse Harris 32:32
I’m going right downstairs to the kitchen after I’m done.
Charis Lam 32:35
Those are great examples. Especially I liked how you turn that metaphor of cooks in the kitchen around and put a new spin on it. So with all this talk about the long process, all the way from needs-finding to quantifying gaps to talking to stakeholders, how long, in your experience, does that entire process take?
Andrew Anderson 32:58
Oh boy, something about counting angels on the head of a pin, you know. It’s really situation-dependent from my perspective. Let’s take the most basic case, where I have a clear set of needs, all of my stakeholder needs have been identified, and I have established a traceability matrix between my business process, my set of needs, and the presumed capabilities that I will require of any technology. Right. Then I perform a landscaping effort. I evaluate prospective technological candidates against both a quantitative assessment of their capabilities and a perhaps qualitative assessment of stakeholders’ experience of utilizing or embracing that technology in an evaluative capacity in the beginning. And I’m able to rank order my technological candidates. I’m able to devise, with the technology license holder, so to speak, some sort of implementation strategy. I have my costs covered. I go through an implementation schedule of some sort. Upon completion of that implementation schedule, I reflect to determine whether the implementation is yielding the anticipated success. And I can evaluate those key performance indicators to determine, “Hey, did I meet my mark or not?”
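The rank-ordering step Andrew describes, combining quantitative and qualitative assessments of each candidate, is often done with a weighted scoring matrix. Here is a minimal sketch; the candidates, criteria, scores, and weights are all invented for illustration.

```python
# Hypothetical criteria weights from a needs assessment (sum to 1.0).
# "cost" is scored so that higher means cheaper.
weights = {"capability_fit": 0.5, "user_experience": 0.3, "cost": 0.2}

# Hypothetical candidates scored 0-10 on each criterion.
candidates = {
    "Vendor A": {"capability_fit": 9, "user_experience": 6, "cost": 5},
    "Vendor B": {"capability_fit": 7, "user_experience": 8, "cost": 8},
    "Build in-house": {"capability_fit": 8, "user_experience": 5, "cost": 3},
}

def weighted_score(scores, weights):
    """Combine per-criterion scores into one weighted total."""
    return sum(weights[c] * s for c, s in scores.items())

# Rank order the candidates, best first.
ranked = sorted(candidates,
                key=lambda name: weighted_score(candidates[name], weights),
                reverse=True)
print(ranked)  # ['Vendor B', 'Vendor A', 'Build in-house']
```

The value of the exercise is less the arithmetic than the forced traceability: every weight has to come from a stated stakeholder need, which is exactly the traceability matrix mentioned above.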
And so I would say for relatively straightforward technological implementations, it can be very fast. Aside from all the documentation work and the like. This can be days to weeks, right? Larger ones that are more disruptive, require more change control and adoption, or training, and the like, can take up to a year or more, in some cases.
I would also add that it’s actually rare to implement a technology off the shelf. Especially really disruptive ones. What I always advocate is stakeholders should embrace longer schedules, but also look at iteration and collaborative iteration. One of the things we do with our collaborators is iterate over time on technologies. So we almost co-create the ideal technology. And that can take, especially in software, that can take several version cycles.
But in a way, it’s a difficult question to answer, Charis, only because it really depends on the type of technology, the degree to which that technology is ready off the shelf, the degree to which you’ll be almost co-creating enhancements to that technology as you move forward. So it can be days to weeks to years, or, in some cases, decades, right. But in my world, it really depends on the project.
Charis Lam 36:03
Yeah, that makes sense. And I think it definitely comes back to that point you were trying to make about really understanding the scenario being the key thing.
Andrew Anderson 36:11
Yeah. Yeah, absolutely.
Charis Lam 36:14
And I guess that helps us go on to a summary of what you think is the most important thing that we should learn from this at the end. What’s your most important takeaway?
Andrew Anderson 36:23
Yeah, if I had to pick a statement, perhaps a glib statement, about any sort of change, technology scouting, adoption, implementation, etc., it’s “measure twice and cut once.” Make sure your plan is robust and reliable before you begin to make those changes. I have certainly seen a less-than-articulate needs assessment, a less-than-articulate plan for evaluation, lead to spending that ultimately doesn’t yield an ideal result for an organization.
I read a quote recently that 70% of so-called digital-transformation projects fail, or at least fail to meet the overall expectations of leadership. One of the factors that I would certainly point to is that measuring twice: understanding the system in detail, understanding the time and motion studies, all of the actors involved in a process, and arriving at a full-throated specification, both in terms of non-functional and functional specs. If you miss a piece and implement a technology prematurely, it may have gaps that ultimately put the overall implementation at risk. So we want to do as much as we can at the outset of a tech-scouting mission, in that needs assessment, to maximize the success of the project. Ultimately, in summary, everybody listening: measure twice, cut once.
Jesse Harris 38:12
Well, thank you so much. You’ve been generous with your time and it was a really interesting take. Very high level and into the details and examples, which was lovely. It was lovely to have you and lovely to chat with you today.
Andrew Anderson 38:24
All right, thank you, Jesse. Thank you, Charis. And I’m certainly around for those listeners who would love to dive deeper into this subject; I can be reached. Absolutely.
Charis Lam 38:36
Thanks very much, Andrew.
Jesse Harris 38:38
“Measure twice, cut once” is a great piece of advice that applies to so many activities, and technology scouting is apparently among them.
Charis Lam 38:47
And I think one of the important things to take away from this conversation is to be really aware of the need and dive into it first, to look at the big picture and the strategy before you go into the tactics.
Jesse Harris 38:59
We hope you took something away from this conversation. For more information, check out Andrew’s blog post at blog.acdlabs.com. This is The Analytical Wavelength. See you next time.
Charis Lam 39:10
The Analytical Wavelength is brought to you by ACD/Labs. We create software to help scientists make the most of their analytical data by predicting molecular properties, and by organizing and analyzing their experimental results. To learn more, please visit us at www.acdlabs.com.
Enjoying the show?
Subscribe to the podcast using your favorite service.