Click for Takeaways: Military Leadership Meets Finance Discipline
  • The stress testing revolution: Banks with $100+ billion in assets faced mandatory Dodd-Frank testing starting in 2012, enduring a painful 2012-2017 learning curve that ultimately forced them to more than double their common equity capital since 2009.
  • Cloud’s seven-year compression: The migration from Hadoop clusters to cloud infrastructure over the past seven years compressed the embedded latency in FP&A cycles, shifting expectations from multi-week turnarounds to near real-time answers.
  • The data governance bottleneck: Without standardized data dictionaries and single sources of truth, teams waste time arguing over which version of numbers is correct rather than solving strategic problems, turning board decks into version battles.
  • Why commercial banking teaches business fundamentals: Analyzing loan repayment capacity forces you to understand how companies turn products into money at a mechanical level, providing more foundational knowledge than market-driven or analytics-driven roles.
  • AI’s regulatory lag: Non-deterministic models that produce different outputs as they learn can create audit and explainability challenges in regulated environments, keeping financial institutions 2-3 steps behind cutting-edge AI capabilities by necessity, not choice.

Most finance professionals think analytics success comes from having the best models. Bobby Bray spent 20 years learning that structured financial analytics actually starts with the best plumbing.

Bray is a CFA charter holder who built analytics functions at Capital One, advised Fortune 500 clients at Oliver Wyman, and commanded thousands of sailors as a Navy captain. His path from carrier-based helicopter pilot to leading analytics at one of the world’s largest financial institutions taught him something counterintuitive: the rigor that keeps aircraft flying is the same risk discipline finance teams need to make FP&A work at scale.

It’s just that most finance teams focus on the wrong part.

Why Analytics Fails at Scale

When Bray talks about building analytics teams, he doesn’t lead with machine learning algorithms or statistical modeling techniques. He leads with plumbing.

“At scale, analytics is less about the math and more about the plumbing and the people. It’s about having clean, reliable data, consistent definitions, and very importantly, a team culture that is curious but disciplined.”

That combination, curious but disciplined, creates tension most analytics teams never resolve. Curiosity pushes you to explore new approaches, question assumptions, generate novel insights. Discipline forces you to document processes, standardize definitions, and maintain audit trails. They pull in opposite directions.

But they’re not incompatible. They’re both required.

The problem is that most teams optimize for one or the other. The math-forward teams build sophisticated models on garbage data. The process-forward teams generate perfectly documented numbers that nobody trusts because they never ask hard questions.

“If there’s one key lesson, it’s don’t just focus on the model, focus on the process. That’s where the discipline comes from. And if you focus on the process and not just the model, you’ll get it right.”

Bray uses an aviation analogy. You need to follow the drop of gas through the engine, understanding every cog that transforms fuel into thrust. The same applies to analytics. You need to trace raw data through every transformation until it becomes a usable insight. If you can’t explain how a number got generated, you can’t trust it. And if you can’t trust it, it doesn’t matter how accurate the model says it is.

The Unglamorous Foundation Nobody Wants to Build

Data governance sounds boring because it is boring. It’s also the backbone of everything that matters in analytics and FP&A.

“Data governance is unglamorous, but it really is the backbone of trust. A good data dictionary, a clear source of truth, and clarity on latency prevents teams from arguing over numbers instead of solving problems.”

Here’s what breaks when you skip data governance: Finance doesn’t agree on baseline assumptions. Every forecast becomes a battle of versions instead of a single story. Board decks get delayed because three different departments report three different revenue numbers. Nobody can explain why the numbers changed between last quarter and this quarter.

Without standardized data dictionaries and single sources of truth, teams waste hours debating which revenue number is correct rather than discussing what the numbers mean for strategy. The board deck becomes a version control exercise instead of a strategic conversation.
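What a "data dictionary with a single source of truth" looks like in practice can be sketched as a simple lookup structure. Everything here is hypothetical (the field name, table name, and owner are illustrative, not from the podcast), but it shows the idea: every disputed metric resolves to one definition and one canonical source.

```python
# A minimal data dictionary entry -- all names and values are hypothetical,
# illustrating the "single source of truth" idea, not any real schema.
DATA_DICTIONARY = {
    "net_revenue": {
        "definition": "Gross bookings minus refunds and discounts",
        "source_of_truth": "finance_dw.revenue_daily",  # the one canonical table
        "owner": "FP&A",
        "latency": "T+1 (refreshed nightly)",
        "unit": "USD",
    },
}

def lookup(field):
    """Settle 'which number is right?' by pointing at the canonical source."""
    entry = DATA_DICTIONARY[field]
    return f"{field}: {entry['definition']} (source: {entry['source_of_truth']})"

print(lookup("net_revenue"))
```

When two departments disagree on revenue, the argument ends at the dictionary entry: whichever number didn't come from the canonical source is, by definition, the wrong one.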

Bray frames it simply. Data governance is housekeeping. Nobody celebrates having a clean house after spending a weekend tidying up. But try being functional in a workspace that’s disorganized chaos. You can’t. The same applies to data.

The challenge is nobody wants to pay for housekeeping. Executives want AI insights and predictive models. They don’t want to fund the tedious work of standardizing field definitions, documenting data sources, and maintaining audit logs. But without that foundation, everything built on top eventually collapses.

During his consulting days, Bray discovered a pattern. Every consultant enters the profession thinking they’ll solve massive strategic problems. Most end up doing the unglamorous work clients just don’t want to do themselves.

“It’s the reality. Data governance is a thing that people don’t wanna do. It’s just what it is, but it’s absolutely critical.”

That insight carries into how successful companies approach analytics. They recognize that competitive advantage comes from having cleaner pipes, not fancier models. When everyone has access to the same cloud infrastructure and the same machine learning tools, differentiation comes from execution discipline, not technical sophistication.

From Big Data to Just Data

Ten years ago, every conversation centered on big data. Hadoop clusters. Data lakes. Managing massive volumes that existing infrastructure couldn’t handle.

Then something shifted. The term disappeared. Not because the problem went away, but because cloud infrastructure made volume irrelevant.

“When cloud infrastructure became a thing and more people adapted to it and adopted it, the amount of data significantly increased. When that happened, it became a matter of scale. How do you distinguish between big data when everybody has access to massive data?”

The competitive advantage moved. It used to be that just having access to large datasets differentiated you. If you could stand up the infrastructure to process terabytes of information, you could outmaneuver competitors still stuck with traditional databases.

That advantage evaporated. Cloud platforms democratized access to unlimited compute and storage. Suddenly the constraint wasn’t collecting everything, it was curating what matters.

Bray sees companies that sprinted ahead because they made this shift early. They moved from “collect everything” to “understand what’s valuable.” From centralized data warehouses to distributed cloud architectures. From batch processing that took days to real-time analysis that takes seconds.

“Now it’s less about size and very much more about trust and speed. Speed is kind of maybe the new competitive advantage.”

For FP&A teams specifically, this created a fundamental transformation. The migration from Hadoop-era centralized systems to cloud infrastructure happened over roughly the past seven years. It changed the job description entirely.

“Instead of spending time just to produce forecasts, teams can spend more time interpreting those forecasts, they can pressure test it, they can ask the so what questions and what if questions that used to not be in the lexicon of an FP&A team.”

But that speed also compressed timelines. The embedded latency where stakeholders expected answers in weeks disappeared. Now they want answers in hours. That puts pressure on FP&A teams to upskill from number generators to strategic thinkers who can act quickly while maintaining analytical rigor.

What Capital Stress Tests Taught FP&A

Before 2008, banks did some scenario testing. It wasn’t rigorous. You generally passed your own tests.

Then came the financial crisis, followed by the Dodd-Frank Act and mandatory capital stress testing. Large U.S. banks faced a painful learning curve from 2012 to 2017. Regulators would give them economic scenarios: unemployment jumps to 10%, GDP contracts 3%, commercial real estate prices drop 30%. Banks had to project those scenarios onto their balance sheets and prove they could survive.

“Pre 2008, some banks did some sort of scenario testing. You kind of always passed yourself. And then after 2008 and subsequently with the Dodd-Frank Act, capital stress tests became mandatory for banks and there was a very, very painful learning curve.”

The consequences were real. Regulators would run the same scenarios using their own models, compare answers, and set capital requirements based on the results. If your model showed you needed less capital than the regulator’s model indicated, you lost. The regulator’s answer determined how much capital you had to hold, which directly impacted your profitability and ability to return cash to shareholders.

The discipline worked. In part due to stress testing and enhanced supervision, the largest banking organizations more than doubled their common equity capital in aggregate since 2009, according to Federal Reserve data.

That created pressure for rigor FP&A teams rarely experience. Your model had to be defensible to examiners. Your assumptions had to be documented. Your processes had to be reproducible. You couldn’t just adjust assumptions until you got the number you wanted because auditors would catch it and you’d face regulatory consequences.
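The mechanics of projecting a scenario onto a balance sheet can be sketched in miniature. All figures below are hypothetical (a made-up bank, an assumed loss rate under an adverse scenario); the point is the shape of the calculation: losses erode capital, and the post-stress capital ratio is what gets compared against a required minimum.

```python
# Illustrative stress projection -- all figures hypothetical, $ billions
capital = 50.0            # common equity tier 1 (CET1) capital
rwa = 500.0               # risk-weighted assets
loan_book = 400.0

# Assumed cumulative loss rate under a regulator-style adverse scenario
# (e.g. unemployment at 10%, commercial real estate down 30%)
stress_loss_rate = 0.045
pre_provision_earnings = 8.0   # earnings that absorb losses before capital is hit

losses = loan_book * stress_loss_rate
post_stress_capital = capital + pre_provision_earnings - losses
post_stress_ratio = post_stress_capital / rwa

print(f"Projected losses: {losses:.1f}")
print(f"Post-stress CET1 ratio: {post_stress_ratio:.1%}")  # vs a required minimum
```

Real supervisory models are vastly more granular, but the asymmetry Bray describes lives in this last line: if the regulator's version of this calculation produces a lower ratio than yours, theirs is the one that sets your capital requirement.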

“The teams learned and they borrowed the discipline of the capital markets teams. Document your assumptions, test the extreme model outcomes, do statistical analysis of a path-dependent scenario, and always ask, ‘What if we’re wrong?’”

Those lessons translated into FP&A even though the stakes are different. FP&A scenario analysis isn’t existential the way capital stress testing is, but it’s no less important. You’re preparing leaders for a range of outcomes so they can make better decisions.

The other critical shift: moving from point estimates to probability distributions.

In pre-stress testing days, you gave leadership a single number. Here’s next quarter’s revenue. Here’s full-year EBITDA. Those numbers were taken as fact.

Post-stress testing, leadership receives ranges with probabilities. There’s an 80% chance revenue falls between X and Y. Here’s what happens if we’re in the tail scenario. That requires a completely different skillset to interpret.

“Point in time number, range of outcomes with probabilities, completely different mindset on how you interpret those things.”

Most finance executives aren’t comfortable with that ambiguity. They want the answer, not a distribution. But Bray channels Nassim Taleb here. A 95% confidence interval will bite you if you’re not careful. Black swans exist. The model that always works perfectly is the one that destroys you when conditions change.
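The shift from a point estimate to a range with probabilities can be sketched with a tiny Monte Carlo simulation. The revenue figure and growth distribution below are illustrative assumptions, not anything from the podcast; the contrast to notice is the single number versus the band that replaces it.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

base_revenue = 100.0                  # hypothetical current-quarter revenue ($M)
growth_mean, growth_sd = 0.03, 0.05   # assumed quarterly growth distribution

# Simulate 10,000 possible next-quarter outcomes instead of one point estimate
outcomes = sorted(
    base_revenue * (1 + random.gauss(growth_mean, growth_sd))
    for _ in range(10_000)
)

def percentile(data, p):
    """Return the p-th percentile of a sorted list (simple index method)."""
    idx = min(len(data) - 1, int(p / 100 * len(data)))
    return data[idx]

point_estimate = base_revenue * (1 + growth_mean)               # the old single number
low, high = percentile(outcomes, 10), percentile(outcomes, 90)  # an 80% range

print(f"Point estimate: {point_estimate:.1f}")
print(f"80% of simulated outcomes fall between {low:.1f} and {high:.1f}")
```

Leadership now has to interpret the band, not just the midpoint, which is exactly the mindset change Bray is pointing at. And the Taleb caveat applies to the code too: the 10% of simulated paths outside that band are where the damage lives.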

The AI Explainability Problem

Every CFO wants to know: when can we use AI in forecasting and planning?

Bray’s answer surprises people: not until you solve explainability.

“Without explainability, the answer may be right, but it’s worthless. Nobody’s gonna buy it. And even if people think it’s right, it won’t count until it’s explained.”

This creates a fundamental problem with generative AI in regulated finance environments. The models are black boxes. You can’t trace how they arrived at an answer. That makes them unusable not because they’re inaccurate, but because you can’t defend them to auditors.

Classical models, even complex ones, are deterministic. Same inputs, same formula, same output every time. If someone questions your forecast, you can walk them through every assumption and calculation. The model might be wrong, but at least you can explain why it produced that specific number.

“A model created by a team will get the same outcome every time if the input model doesn’t change. Same data, same formula, same outcome. That is not true in the case of AI. As AI models learn and ingest data, they will not necessarily produce the same results time after time.”

That non-determinism creates compliance nightmares. If the model produces different results as it learns, how do you audit it? How do you explain to regulators why the forecast changed, not because underlying business conditions changed, but because the algorithm updated itself?
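The contrast Bray draws can be shown with a toy pair of models (both hypothetical, written only to illustrate the audit problem): a classical formula that always returns the same answer for the same inputs, and a "learning" model whose answer to the same question drifts as it ingests new data.

```python
def deterministic_forecast(revenue, growth_rate):
    """A classical model: same inputs always yield the same output."""
    return revenue * (1 + growth_rate)

class LearningForecaster:
    """A toy 'learning' model: its answer to the *same* question drifts
    as it ingests new observations -- the property that makes audit
    trails hard without explainability controls."""
    def __init__(self):
        self.observed_growth = []

    def ingest(self, growth):
        self.observed_growth.append(growth)

    def forecast(self, revenue):
        avg = sum(self.observed_growth) / len(self.observed_growth)
        return revenue * (1 + avg)

# Classical model: fully repeatable, hence explainable line by line
assert deterministic_forecast(100, 0.03) == deterministic_forecast(100, 0.03)

model = LearningForecaster()
model.ingest(0.03)
first = model.forecast(100)   # the model's first answer
model.ingest(0.07)            # the model keeps learning...
second = model.forecast(100)  # same input, different answer
print(first, second)
```

The business didn't change between `first` and `second`; only the model's internal state did. That is the gap an auditor will ask you to explain.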

This reality keeps financial institutions roughly two to three technology generations behind cutting-edge AI capabilities. Not because they lack technical talent or budget, but because their comfort level with unexplainable models has to catch up to the technology. That’s not bad risk management, it’s prudent risk management.

Bray’s pragmatic recommendation: use AI as an assistant, not an oracle. Let it surface patterns, suggest scenarios, flag anomalies. But humans make the final call and own the explanation.

On data privacy, Bray acknowledges the risks are real. You can expose competitive secrets or run afoul of regulations. Supervisory information from regulators is confidential by law. Share it with the wrong AI system and you face sanctions.

The solution isn’t avoiding AI. It’s implementing guardrails. Ring-fence sensitive data. Use secure platforms. Be explicit about what information gets fed into models.

“Innovation has to be paired with discipline. Those go hand in glove.”

JP Morgan offers the template. They’ve rolled out AI across major parts of the organization despite being in everything from investment banking to consumer lending. They figured out the guardrails that let them move fast while staying compliant. Other banks can learn from that speed of adoption.

Commercial Banking’s Hidden Value

Before Bray built analytics teams or advised Fortune 500 clients, he spent years in commercial banking. To this day, he considers it the most educational job he’s had in the private sector.

“Commercial banking is essentially about how companies make money. A commercial banker cares about lending money and how we’re gonna get paid back, and the way you figure out if a company can pay you back is figuring out how they turn a product into money.”

That’s a fundamental skill. It’s not market-driven like trading. It’s not analytic like quant research. It’s pre-AI. You have to understand the mechanics of how a business creates value: how it sources materials, manufactures products, sells to customers, collects cash, covers costs, and hopefully has money left over.

If a manufacturing company wants to borrow $5 million, you need to understand its entire operation. What’s their gross margin? How long does inventory sit before it sells? What happens to cash flow if a major customer delays payment by 30 days? Can they service debt during a recession when revenue drops 20%?

Those questions force you to think about business at a foundational level. Most finance roles skip this step. They start with market dynamics or sophisticated models without understanding the basic mechanics of value creation.

“That’s a fundamental skill of business. It’s not market-driven. It’s not analytic. This is turning product into money, and I use those skills all along.”

For FP&A professionals, especially, that grounding matters. If you can’t explain how the business makes money in simple terms, you can’t build reliable forecasts. The models become abstract exercises disconnected from operational reality.

From Engineering Mind to Leadership Mindset

Bray’s path from Naval Academy engineering major to helicopter pilot to finance executive taught him something most technical professionals miss: management and leadership are different skills.

Management is tasking. It’s applying skillsets to the workload. You have five people, ten projects, and forty hours per week. Management is the Tetris game of fitting capacity to demand. It’s necessary but not sufficient.

Leadership is psychology. It’s understanding what motivates people, what their “why” is, what makes them thrive or struggle.

“Leadership is fundamentally a practical application of psychology. Some people are motivated by money. Some people are motivated by just putting their head down and doing a good job. Some people are motivated by stability. If you have a person who’s motivated by stability and you try to stretch them, they may struggle.”

The military taught him two critical elements that translate directly to corporate finance: trust and clarity.

In aviation, you trust the person who worked on your aircraft, even if you barely know them. You trust your air crew with your life. Those stakes create a culture where trust isn’t optional.

That same trust carries into finance teams. If your analysts don’t trust the data, they’ll build shadow systems. If leadership doesn’t trust FP&A’s forecast, they’ll ignore it and make decisions based on gut feel. Trust is the foundation on which everything else builds.

Clarity means people know what success looks like and why their role matters. When can we hang the banner? What does winning look like? If team members don’t know the answer, they’re just completing tasks without understanding the mission.

“People need to know what success looks like and why their role matters. This goes back to psychology. Why are you doing what you’re doing and when can we hang the banner?”

Wind the Clock

The first lesson flight instructors teach seems counterintuitive: if you have an emergency, wind the clock.

The old mechanical clocks in legacy training aircraft would wind down. Reaching out to wind the clock forces you to pause.

“If you take the time to reach out there and wind the clock, you prevent yourself from making rash, rushed decisions.”

Bray carries that into finance. When something breaks, when numbers don’t make sense, when executives demand immediate answers, he winds the clock. Some people think that means indifference. It doesn’t. It means processing carefully rather than reacting rashly.

That discipline, paired with decisiveness, defines his approach. The military teaches you to act with imperfect information. You rarely have everything you need, but you still have to make the call and own the outcome. Finance is the same. You’ll never have perfect data or perfect models. At some point, you have to decide and move forward.

Most finance professionals never develop that comfort with decision-making under uncertainty because the corporate world lets you push decisions up the chain. But eventually you reach the top of the chain. And if you haven’t practiced making calls with incomplete information, that’s a hard skill to learn late in your career.

The companies that win at analytics aren’t the ones with the fanciest AI models. They’re the ones that built clean pipes, maintained rigorous processes, and developed the curious, disciplined culture that defines high-performance finance teams that make decisions quickly with imperfect information.

That’s what “wind the clock” really means. Take a breath. But then make the call.

Where Datarails Fits

At Datarails, we understand that structured financial analytics starts with data infrastructure, not algorithms. Our Excel-native FP&A platform consolidates data from across your organization into a single governed source of truth, giving your team the clean pipes and consistent definitions that Bray describes as the foundation for analytics at scale. Because when your data governance is solid and your processes are documented, you can finally spend time interpreting forecasts and asking “what if” questions instead of arguing about which version of the revenue number is correct. That’s when FP&A transforms from number generation to strategic decision support.

This article is based on Bobby Bray’s appearance on the FP&A Today podcast

Bobby Bray is a CFA charter holder who has built analytics functions at Capital One and advised Fortune 500 clients at Oliver Wyman. A retired Navy captain who commanded thousands of sailors and flew carrier-based helicopters, he brings military precision to financial data governance, stress testing, and analytics team leadership.