Brian Boland testified about shifting from “deep blind faith” in Meta to becoming its public critic.


Brian Boland spent more than a decade figuring out how to build a system that would make Meta money. On Thursday, he told a California jury it incentivized drawing more and more users, including teens, onto Facebook and Instagram — despite the risks.
Boland’s testimony came a day after Meta CEO Mark Zuckerberg took the stand in a case over whether Meta and YouTube are liable for allegedly harming a young woman’s mental health. Zuckerberg framed Meta’s mission as balancing safety with free expression, not revenue. Boland’s role was to counter this by explaining how Meta makes money, and how that shaped its platforms’ design. Boland testified that Zuckerberg fostered a culture that prioritized growth and profit over users’ wellbeing from the top down. He said he’s been described as a whistleblower — a term Meta has broadly sought to limit for fear it would prejudice the jury, but which the judge has generally allowed. Over his 11 years at Meta, Boland said he went from having “deep blind faith” in the company to coming to the “firm belief that competition and power and growth were the things that Mark Zuckerberg cared about most.”
Boland, who joined the company in 2009 and worked in a variety of advertising roles, last served as Meta’s VP of partnerships before leaving in 2020, a job focused on bringing content to the platform that it could monetize. He testified that Facebook’s infamous early slogan of “move fast and break things” represented “a cultural ethos at the company.” He said the idea behind the motto was generally, “don’t really think about what could go wrong with a product, but just get it out there and learn and see.” At the height of the motto’s internal prominence, employees would sit down at their desks to find a piece of paper that asked, “what will you break today?” Boland testified.
Zuckerberg consistently made his priorities for the company abundantly clear, according to Boland. He’d announce them at all-hands meetings and leave no doubt about what the company should be focused on, whether that was building its products to be mobile-first or getting ahead of the competition. When Zuckerberg decided that then-Facebook had to get into shape to compete with a rumored Google social network (which he didn’t name, but which seemed to be Google+), Boland recalled a digital countdown clock in the office symbolizing how much time remained to hit their goals during what the company called a “lockdown.” During his time at the company, Boland testified, there was never a lockdown around user safety, and Zuckerberg allegedly instilled in engineers that “the priorities were on winning growth and engagement.”
Meta has repeatedly denied that it tries to maximize users’ engagement on its platforms over safeguarding their wellbeing. In recent weeks, both Zuckerberg and Instagram CEO Adam Mosseri testified that building platforms users enjoy and feel good using is in their long-term interest, and that’s what drives their decisions.
Boland disputes this. “My experience was that when there were opportunities to really try to understand what the products might be doing harmfully in the world, that those were not the priority,” he testified. “Those were more of a problem than an opportunity to fix.”
When safety issues came up through press reports or regulatory questions, Boland said, “the primary response was to figure out how to manage through the press cycle, to what the media was saying, as opposed to saying, ‘let’s take a step back and really deeply understand.’” Though Boland said he told his advertising-focused team that they should be the ones to discover “broken parts,” rather than those outside the company, he said that philosophy didn’t extend to the rest of the company.
On the stand the day before, Zuckerberg pointed to documents around 2019 showing disagreement among his employees with his decisions, saying they demonstrated a culture that encourages a diversity of opinion. Boland, however, testified that while that might have been the case earlier in his tenure, it later became “a very closed down culture.”
Since the jury can only consider decisions and products that Meta itself made, rather than content it hosted from users, lead plaintiff attorney Mark Lanier also had Boland describe how Meta’s algorithm works, and the decisions that went into making and testing it. Algorithms have an “immense amount of power,” Boland said, and are “absolutely relentless” in pursuing their programmed goals — in many cases at Meta, that was allegedly engagement. “There’s not a moral algorithm, that’s not a thing,” Boland said. “Doesn’t eat, doesn’t sleep, doesn’t care.”
During his testimony on Wednesday, Zuckerberg commented that Boland “developed some strong political opinions” toward the end of his time at the company. (Neither Zuckerberg nor Boland offered specifics, but in a 2025 blog post, Boland indicated he was deleting his Facebook account in part over disagreements with how Meta handled events like January 6th, writing that he believed “Facebook had contributed to spreading ‘Stop the Steal’ propaganda and enabling this attempted coup.”) Lanier spent time establishing that Boland was respected by peers, showing a CNBC article about his departure that quoted a glowing statement from his then-boss, and a reference to an unnamed source who reportedly described Boland as someone with a strong moral character.
On cross-examination, Meta attorney Phyllis Jones established that Boland didn’t work on the teams tasked with understanding youth safety at the company. Boland agreed that advertising business models are not inherently bad, and neither are algorithms. He also acknowledged that many of his concerns involved the content users were posting, which is not relevant to the current case.
During his direct examination, Lanier asked if Boland had ever expressed his concerns to Zuckerberg directly. Boland said he’d told the CEO he’d seen concerning data showing “harmful outcomes” of the company’s algorithms and suggested that they investigate further. He recalled Zuckerberg responding something to the effect of, “I hope there’s still things you’re proud of.” Soon after, he said, he quit.
Boland said he left upwards of $10 million worth of unvested Meta stock on the table when he departed, though he admitted he made more than that over the years. He said he still finds it “nerve-wracking” every time he speaks out about the company. “This is an incredibly powerful company,” he said.