Shelley Webb has moved into a role that many senior in-house legal professionals know well: the attorney who wears two corporate hats. Webb is the general counsel and chief people officer at PagerDuty, a technology provider whose digital platform helps companies oversee operations and manage rapid incident responses.
She joined PagerDuty in April 2022 after eight years as in-house counsel at Intel, including working as lead attorney for the company’s PC business. Webb began her career as a corporate litigator with Williams & Connolly in Washington after receiving her bachelor’s degree in economics from the University of Virginia and her JD from Stanford Law School.
San Francisco-based PagerDuty — whose employees call each other “Dutonians” — has a legal team of about 25 people and a human resources department of roughly 70 people. The company says it works with nearly 70% of Fortune 100 companies.
Editor’s note: Legal Dive’s conversation with Webb has been edited for clarity and length.
LEGAL DIVE: How did you become interested in law?
SHELLEY WEBB: As an undergrad I took a class in antitrust law taught by an esteemed economics professor [Ken Elzinga] who served as an expert witness in antitrust cases. I was an economics and government major. I’d been considering pursuing a Ph.D. in economics, and that antitrust law class brought together so many things that I loved. I loved the economic underpinnings of it, but I also loved the reading, the persuasive writing, the logical reasoning and the analytical thinking. So when I was 19 years old, I decided I wanted to be an antitrust lawyer. I’m really proud I accomplished that. But I had no idea it was going to become such a mainstream media topic, or that it would bring me to Silicon Valley.
How different was the work when you left the law firm for Intel?
In the litigation and antitrust group at Intel, there were elements that were similar to my outside counsel practice, because I was still running litigation. The changes were greater when I moved from a litigation role to a product role, because you’re focused on understanding the strategy and what legal and policy tools you have to help solve business problems.
And now you have HR responsibilities. What is that like?
I think about my role as having three big responsibilities. One is people, the second is strategy and the third is execution — in that order. When you do those three things well, there’s really no problem you can’t solve. So job No. 1 for me as the leader of two departments is to get the right people in the right roles. When I’m recruiting and developing internal talent, I’m looking for people who have a record of excellence in doing hard things. And, especially in-house, they have to be able to harness the talents of others so success is achieved as a team. Otherwise, you can’t get anything done at a company.
Once you’ve got the right people in the right roles, it’s our job as leaders to drive the department’s strategy. For both the legal team and the people team, that strategy has to be grounded in the business strategy. Back in my earliest days supporting a product group, when we had an annual process of developing OKRs — objectives and key results — I began a practice of nesting the legal team’s OKRs underneath the business OKRs, so that the business could see how each key legal result would contribute to the success of the business. Once you do those two things, you’ve got people and strategy down. All that’s left to do is execute. And so much of execution boils down to following standardized processes: creating project plans, clarifying roles and documenting it all.
Do you aim for a 50-50 split of your time?
It ebbs and flows. The more time you spend in any area, the more you can gradually scale back, because you’ve built familiarity with it. I’ve been in the chief people officer role for just over six months, so in the months when it was newest to me, I spent more of my time trying to distill a lot of information. And then it varies week by week, depending on what’s going on and whether there are legal issues. If there’s a board meeting or preparation for one, I’ll be spending more of my time with my legal hat on. But in other weeks, if there’s executive hiring or people development strategy planning, I’ll be spending more time on the people role.
Are you using any AI legal tools? And how do you use them?
One of the areas that I’m most excited about is the evolving opportunity to use AI to speed up contract reviews and negotiations. We’re exploring tools that allow us to upload our contract negotiation playbooks and have the AI suggest fallbacks within the document. This functionality will get smarter with continued use, and it could be a time saver for us as a first draft in a standard negotiation. That could allow us to drive efficiency and repurpose the great legal expertise within our department to the highest-value uses. Aside from the contract piece, there are other time savers that some of our legal team members are using. Someone wise once told me that the hardest thing in work is knowing whether to spend 10 seconds, 10 minutes or 10 hours on something. That remains excellent advice, but with the AI tools at our disposal to help generate efficiencies, something that might have taken 10 minutes before could now take 10 seconds.
Where do the cost savings come from when a GC gets these AI tools? Because you’re probably not thinking about laying off people.
No, and I’ve always operated with lean teams. As the company scales, I’ve seen existing team members repurpose their time, so when you have a higher volume of contracts, or more advice to give, you don’t have to add headcount the way people might have assumed before. You can scale the existing resources and give your team members opportunities to do the most interesting legal work on novel issues, as opposed to standardized, routine, repetitive work.
If a contract review took five hours before, and now we outsource it to the AI, the lawyer’s five hours will be deployed somewhere else. Is that it?
Or maybe four of their five hours will be deployed somewhere else. They’ll have the first pass done by AI, and they’ll be able to use that as a starting point. I always feel like it’s so much easier to edit something than it is to stare at a blank piece of paper. So when AI does that for us, it gives you that head start. In addition to thinking about how we’re using AI as a department, I spend a lot of time, as do my legal team members who are focused on AI, thinking about how we’re incorporating AI into our products.
Let’s discuss state and federal AI regulation. California has done a bit of that, and more may be coming. What do you think the regulations should look like?
The best way is a risk-based approach. You have to understand the roles that the different players have in the ecosystem. Are you somebody who is developing large language models and therefore has the ability to control what kind of data is going into them and how they’re being trained and created? Or are you somebody who is deploying large language models and building product offerings on top of existing models for customers? So distinguishing the responsibilities assigned to each type of player in that value chain is important.
Where do you think this regulation should reside? At the international, national or state level?
The most important goal is consistency. So, there are multiple ways one could get that. Having a patchwork of laws is the hardest place to be.
Do you think we’ll see the tech community go to Congress and ask for federal laws, to avoid Colorado doing one thing and California doing something else?
There’s certainly ongoing advocacy to establish federal AI legislation, as an opportunity for the U.S. to take a thought leadership position, ensure consistency and avoid a patchwork of state laws. It’s similar to what we’ve seen with privacy legislation. At this stage, it’s unknown whether that kind of legislation will pass. But I think the most important thing, regardless of what level the legislation is at, is that we have a principles-based approach with consistent obligations on companies.
What do you expect we’ll see in the regulation of the tech community as we move from the Biden administration to the Trump administration?
We don’t know what personnel will be appointed to the cabinet and to the leadership of the agencies, and that has a big impact. Certainly from statements made on the campaign trail, one would expect changes in the approach to several technology issues like AI, cryptocurrency, antitrust and competition. But at this stage, how those shifts will play out, and over what time period, is unknown. So I wouldn’t want to speculate.