Sedric monitors the communications of employees at financial institutions to ensure compliance

For financial institutions, complying with regulations is becoming a costlier proposition. According to a recent poll, 76% of financial services firms increased their compliance expenditure from 2022 to 2023, with most blaming new laws.

With the cost of compliance averaging out to around $10,000 per employee these days, many firms are searching for ways to reduce spending without running afoul of regulators. Entrepreneurs Nir Laznik and Eyal Peleg say that they’ve created a solution — powered by generative AI, as is the trend.

Laznik and Peleg co-founded Sedric, an AI-powered platform designed to help financial institutions implement compliance rules and flag possible issues. Prior to Sedric, Laznik spearheaded several startups, including a photo kiosk software firm, while Peleg spent close to eight years at Intel’s AI and machine learning org.

“We realized there was disproportionate pressure on mid-size organizations, combined with a new set of challenges for banks,” Laznik said. “We knew the rapid advancements in AI could address these problems in an entirely new way. This convergence of factors led us to create Sedric.”

Sedric’s AI acts as an overseer of sorts, monitoring a workforce’s outbound and inbound calls, chats, emails, social media DMs, and instant messages. It attempts to flag compliance problems (e.g., omitted disclosures, missed steps, and misconduct) as they happen; in many cases, Laznik claims, Sedric can automatically “mitigate” issues and coach the offending staff.

“This technology empowers compliance officers with a holistic view of their customer touchpoints across multiple channels, allowing them to flag deviations from established compliance policies and guidelines quickly and efficiently,” Laznik said. “Our platform covers the entire compliance lifecycle, from policy setting to enforcement, correction and audit.”
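To make the flagging step concrete, here is a minimal, hypothetical sketch of what a disclosure check over outbound messages could look like. The rule set, the Flag type, and the scan_message function are illustrative assumptions for this article, not Sedric’s actual implementation.

```python
# Hypothetical sketch: flagging omitted disclosures in outbound messages.
# REQUIRED_DISCLOSURES, Flag, and scan_message are illustrative names, not Sedric's API.
from dataclasses import dataclass

REQUIRED_DISCLOSURES = {
    "loan_offer": ["APR", "repayment terms"],
    "investment_pitch": ["past performance is not indicative of future results"],
}

@dataclass
class Flag:
    channel: str
    message_id: str
    missing: list[str]

def scan_message(channel: str, message_id: str, intent: str, text: str) -> Flag | None:
    """Return a Flag if the message omits any disclosure required for its intent."""
    required = REQUIRED_DISCLOSURES.get(intent, [])
    missing = [d for d in required if d.lower() not in text.lower()]
    return Flag(channel, message_id, missing) if missing else None

if __name__ == "__main__":
    flag = scan_message("email", "msg-42", "loan_offer",
                        "Great news! You're pre-approved for a personal loan today.")
    if flag:
        print(f"Compliance issue on {flag.channel}/{flag.message_id}: missing {flag.missing}")
```

A production system would layer language models and channel-specific parsers on top of rules like these, but the basic loop of intercepting a message, checking it against policy, and raising a flag is the same.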

A screenshot of Sedric’s backend dashboard.
Image Credits: Sedric

Surveillance that deep might sound a little intrusive — not least because Sedric “scores” interactions on a per-employee basis according to adherence to company policies. But, for better or worse, U.S. state and federal guidelines give wide discretion to businesses engaged in monitoring their staff so long as the businesses are reasonably transparent about it.

Moreover, some federal-level regulations — particularly regulations pertaining to insider trading, collusion, and the sharing of certain earnings documents — mandate that financial institutions closely track workers in their interactions with customers and the broader marketplace. These preempt state laws, like New York’s and Connecticut’s, that impose additional requirements on employers conducting workforce monitoring.

I asked Laznik about the potential for bias in Sedric’s AI, given that it will likely be monitoring the communications of staff from a wide range of backgrounds. Biased AI can lead to discrimination, intentional or not, depending on where and how it’s deployed.

Studies have shown that some AI trained to detect toxicity sees phrases in African-American Vernacular English, the informal grammar used by some Black Americans, as disproportionately “toxic.” Other studies have demonstrated how speech recognition systems are more likely to wrongly transcribe audio of Black speakers as opposed to their white counterparts.

Laznik says that Sedric uses “fine-tuned models” trained on “proprietary datasets curated and validated in collaboration with industry experts” to try to minimize bias. The company also monitors for performance dips in deployed models and retrains models when necessary.

Sedric
Image Credits: Sedric

“Our platform enables customers to provide direct feedback through various annotation inputs, which are then vetted by compliance teams and are used for re-training or integrated into the prediction process,” he added. “This ensures that our models become increasingly tailored for each customer.”
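The feedback loop Laznik describes — reviewer annotations feeding back into retraining — is a standard pattern for keeping deployed models from drifting. Below is a minimal, hypothetical sketch of that pattern; the threshold, function names, and data shapes are assumptions, not Sedric’s implementation.

```python
# Hypothetical sketch of an annotation-driven retraining loop: reviewer corrections
# are recorded, live accuracy is tracked, and retraining is triggered on a dip.
# All names and thresholds here are assumptions, not Sedric's implementation.

VETTED_ANNOTATIONS: list[dict] = []
ACCURACY_FLOOR = 0.92  # assumed threshold for acceptable live performance

def record_annotation(message_id: str, model_label: str, reviewer_label: str) -> None:
    """Store a compliance reviewer's verdict on a model prediction."""
    VETTED_ANNOTATIONS.append(
        {"message_id": message_id, "model": model_label, "reviewer": reviewer_label}
    )

def live_accuracy() -> float:
    """Fraction of recent predictions the reviewers agreed with."""
    if not VETTED_ANNOTATIONS:
        return 1.0
    agreed = sum(a["model"] == a["reviewer"] for a in VETTED_ANNOTATIONS)
    return agreed / len(VETTED_ANNOTATIONS)

def maybe_retrain() -> bool:
    """Kick off fine-tuning on the vetted annotations if performance has dipped."""
    if live_accuracy() < ACCURACY_FLOOR:
        # a real pipeline would call its fine-tuning job here
        print(f"Retraining on {len(VETTED_ANNOTATIONS)} vetted examples")
        return True
    return False

record_annotation("msg-42", "violation", "no_violation")
maybe_retrain()
```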

To protect customer — and employee — privacy and security, Sedric allows companies to configure where their data is stored and implement controls that redact (or at least attempt to redact) personally identifiable information.
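For a sense of what the redaction step involves, here is a bare-bones, hypothetical example using regex patterns. Real systems typically combine patterns like these with ML-based entity detection; the patterns and names are illustrative assumptions, not Sedric’s.

```python
# Minimal illustrative PII redaction: replace matched spans with labeled placeholders
# before the text is stored or passed to downstream models.
import re

PII_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace matched PII spans with a labeled placeholder."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

print(redact("Call me at 555-867-5309 or email jane.doe@example.com about SSN 123-45-6789."))
```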

“At Sedric, we’ve designed our platform with compliance and security at its core,” Laznik said. “Enterprises can set their own retention policies and compliance rules according to their internal guidelines and the specific regulations they face.”

Sedric, which also offers tools to support call center agents as they’re chatting with clients on the phone, has “hundreds” of paying compliance officers and enterprise customers in the U.S. and Europe, Laznik claims.

Revenue has increased fivefold over the last year — although Laznik declined to give firmer numbers.

“For small- and medium-sized businesses, we offer an off-the-shelf solution, and for enterprises and banks, we offer a hybrid model with tailored customizations,” Laznik said. “Our focus on the specific needs of financial institutions, combined with our proprietary library of pre-trained, regulation-inspired models that can also be customized to each organization’s unique requirements, sets us apart in the market.”

Going after finance customers and use cases specifically appears to have worked in Sedric’s favor, setting the company apart from workplace monitoring rivals such as Fairwords, Shield, Erudit and Aware. It’s a crowded — and often controversial — market, but investors still sense opportunity, particularly as AI becomes more deeply embedded in these types of tools.

Case in point: Seemingly pleased with Sedric’s progress so far, Foundation Capital led an $18.5 million Series A investment in the four-year-old company, with participation from Amex Ventures. The new cash will go toward growing the firm’s go-to-market and R&D teams “significantly” in NYC and Tel Aviv, Laznik said, and brings New York-based Sedric’s total raised to $22 million.

Sedric plans to double its headcount in the next 12 months.


