> But just because a company has appointed a CRO doesn’t necessarily mean that it has made risk management a high priority.
Priority or not, it suggests the company doesn't understand risk. In a company that doesn't treat risk-adjusted rates of return as a natural part of how it does things, a CRO is a mildly bad sign.
An analogy might be helpful. Testing code is, with some squinting, a form of institutionalised risk management. Any particular test doesn't necessarily do anything useful, but tests apply a steady pressure that makes the code in general fail less and forces people to think more about how they're writing their functions. If a company tells you it has a special pool of coders who add tests, separate from the ones who write the actual code, that is a sign it doesn't know how to do testing. A huge chunk of the value comes from forcing the person who makes the front-line decisions to think about what they are doing. That's not to say a dedicated testing team never makes sense in some unusual companies, but it is the exception to the rule. For most companies, risk management isn't the kind of responsibility that should be split out into a separate role, because that is much less valuable than having the people who do the work sit in a management chain that understands risk.
What risk measures for risk adjusted returns would you use (e.g., in SaaS)?
I meant that in the sense that a typical SaaS company has no reason to be formally thinking about risk adjusted returns and therefore has no need of a CRO. If anyone cares product can do a guesstimate or something. Most companies shouldn't have a CRO.
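To make "a guesstimate" concrete, here is a back-of-the-envelope sketch: discount a headline return by its expected loss. The function name, the numbers, and the formula choice are all mine for illustration, not anything standard in SaaS finance.

```python
def risk_adjusted_return(expected_return, loss_probability, loss_given_failure):
    """Naive risk adjustment: headline return minus expected loss."""
    return expected_return - loss_probability * loss_given_failure

# Illustrative only: two hypothetical projects with the same headline return.
safe = risk_adjusted_return(100.0, loss_probability=0.05, loss_given_failure=200.0)
risky = risk_adjusted_return(100.0, loss_probability=0.40, loss_given_failure=200.0)
print(safe, risky)  # 90.0 vs 20.0: same headline, very different risk-adjusted
```

The point is only that the arithmetic is simple enough for a product team to do on a napkin; nothing here needs a dedicated executive.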
If you’re a B2B SaaS with no CRO, good luck with vendor assessments. If you're B2C, you can skip it until you reach a critical mass where regulatory pressure will mandate one.
Agreed. This maps directly to the white-box vs black-box testing distinction: either you own your priors and trace the full data lineage from training through validation, or you're relying on an opaque validation set of unknown provenance. And that's before factoring in the organizational politics.
I find this line of thinking similar to companies with "innovation" officers. That is, having an employee who is "in charge of" innovation implies all the other employees aren't?
You completely miss the role of the CRO, or the risk function, in an organisation. Using your analogy, a Chief Testing Officer would not write the tests. They would establish how test coverage is defined and measured, and set the target coverage. They would monitor each team's progress in meeting those targets. It is a governance role that sits as a second line behind the first line, which has the immediate responsibility to manage the risk.
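The second-line idea can be sketched in a few lines: a governance function that writes no tests itself, only sets the target and flags the teams below it. The team names and threshold here are invented for illustration.

```python
def teams_below_target(coverage_by_team, target=0.80):
    """Second-line monitoring: return teams whose coverage misses the target.

    The governance role owns the target; the first-line teams own the tests.
    """
    return sorted(team for team, cov in coverage_by_team.items() if cov < target)

if __name__ == "__main__":
    # Illustrative numbers only.
    coverage = {"billing": 0.91, "onboarding": 0.62, "reporting": 0.78}
    print(teams_below_target(coverage))  # -> ['onboarding', 'reporting']
```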
Risk-adjusted rates are not traditionally in the mandate of a CRO. They sit with Finance or Treasury, and they should be abstracted away from the front line, who would experience them only through the optimisation of their funding.
This sounds well lined up with what I was saying? The CRO doesn't manage risks. Having one in with the executives is a signal that the company is putting resources into communicating with regulators, not that it is committed to managing risks in any way. That isn't what these regulatory-heavy roles are for: their job is to make sure the regulators don't investigate. That is in no way a signal that the company has any ability at risk management, and is a slight signal that it might think "risk" just means the government will sue it or shut it down.
If a company were actually serious about managing its risks, it'd be a relatively quiet role reporting to someone responsible for operations, like the CTO, COO, or head of product. Maybe part of the CEO's personal staff, but not an exec.
I would argue that testing code is “Risk Mitigation” not “Risk Management”.
It is nuanced, but at least in large Systems Engineering orgs, Risk Management is typically a different thing entirely.
It entails documenting known risks, evaluating the likelihood and potential impacts, defining mitigating actions, tracking the closure of those actions and the resultant reduction in the likelihood of the risk manifesting.
This is both centralized and distributed. The specific SMEs provide most of this input/definition, but it is also useful to have a centralized understanding of all the system risks by someone with a system level purview.
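The register described above (likelihood, impact, mitigating actions, and the resulting likelihood reduction) can be sketched as a small data structure. The field names and figures are my own invention, not any systems-engineering standard.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    # One entry in a hypothetical risk register.
    name: str
    likelihood: float            # probability of the risk manifesting, 0..1
    impact: float                # cost if it manifests, in arbitrary units
    mitigated_likelihood: float  # likelihood once mitigating actions close

    def exposure(self, after_mitigation=False):
        """Expected cost: likelihood times impact."""
        p = self.mitigated_likelihood if after_mitigation else self.likelihood
        return p * self.impact

# Illustrative entry: tracking closure of actions shows the exposure drop.
r = Risk("vendor outage", likelihood=0.25, impact=80.0, mitigated_likelihood=0.125)
print(r.exposure())                       # 20.0 before mitigation
print(r.exposure(after_mitigation=True))  # 10.0 after
```

The SME supplies the numbers per risk; the centralized view is just the sum of exposures across all entries.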
> If a company tells you that it has a special pool of coders who add tests, separate from the ones that write the actual code, that is a bad sign that they know how to do testing.
I disagree that this is necessarily a bad sign. The people writing the code have blind spots, and they may not necessarily be experts at testing either. Probably the highest-quality software I ever worked on was built in a setup that combined developers writing tests with dedicated people who wrote only tests. That said, I think this setup is secondary to the quality and experience of the teams and individuals.
> A huge chunk of the value is forcing the person who makes the front line decisions to think about what they are doing.
I would look at this differently and say a huge chunk of the value comes from making sure you have the right person in the front line. The wrong person being "forced" to make decisions they're not good at is not going to help you much. The right person doesn't need forcing to make the right decisions. People and culture drive outcomes, not process.
Financial risk management is a great industry for data science; I've been doing it for 15+ years. It is an amazingly data-rich environment, spanning credit bureaus, customer transaction history, call-center dynamics, and of course the financials themselves. It is a gold mine of opportunity for finding new and fresh ways to observe the organization. That is also what makes risk management difficult: there is so much to know. LLMs are going to change risk management just as they are changing every other industry, and it will be interesting to see where it is headed.
I’m using LLMs to set up risk models for startups and SMBs (where there are no internal resources yet). It works.
I have a risk mapping tool live here: https://siqnalis.com/company (You can test it with “beta2026”.)
One model is live, but a lot of stuff is on the roadmap.
Classic stuff. True risk management is pretty hardcore science.
I’ve always taken a more casual, off-the-cuff approach, and I write about it here[0].
[0] https://littlegreenviper.com/risky-business/
Subtle, now that it is ancient history ...
https://hatstore.co.uk/risk-management-department-dark-green...