The prevalence of so-called fake news is far worse than we imagined even a few months ago. Just last week, Twitter admitted there were more than 50,000 Russian bots trying to confuse American voters ahead of the 2016 presidential election.

It isn’t just elections that should concern us, though. So argues Jonathon Morgan, the co-founder and CEO of New Knowledge, a two-and-a-half-year-old, Austin-based cybersecurity company that’s gathering up clients looking to fight online disinformation. (Worth noting: The 15-person outfit has also quietly raised $1.9 million in seed funding led by Moonshots Capital, with participation from Haystack, GGV Capital, Geekdom Fund, Capital Factory and Spitfire Ventures.)

We talked earlier this week with Morgan, a former digital content producer and State Department counterterrorism advisor, to learn more about his product, which smartly leverages concerns about fake social media accounts and propaganda campaigns to win business from brands eager to protect their reputations. Our chat has been edited lightly for length and clarity.

TC: Tell us a little about your background.

JM: I’ve spent my career in digital media, including as a [product manager] at AOL when magazines were moving onto the internet. Over time, my career moved into machine learning and data science. During the early days of the application-focused web, there wasn’t a lot of engineering talent available, as the work wasn’t seen as sophisticated enough. People like me who didn’t have an engineering background but who were willing to spend a weekend learning JavaScript and could produce code fast enough didn’t really need much of a pedigree or experience.

TC: How did that experience lead to you focusing on tech that tries to understand how social media platforms are manipulated?

JM: When ISIS was using techniques to jam its messaging into social media conversations, conversations that were elevated in the American press, we started trying to figure out how they were pushing their message. I did a little work for the Brookings Institution, which led to some work as a data science advisor to the State Department — developing counterterrorism strategies and understanding what public discourse looks like online and how it changes when it’s been hijacked.

TC: Now you’re pitching this service you’ve developed with your team to brands. Why?

JM: The same mechanics and tactics used by ISIS are now being used by much more sophisticated actors, from hostile governments to kids who are coordinating activity on the internet to undermine things they don’t like for cultural reasons. They’ll take Black Lives Matter activists and immigration-focused conservatives and amplify the discord between them, for example. We’ve also seen alt-right supporters on 4chan undermine movie releases. These kinds of digital insurgencies are being used by a growing number of actors to manipulate the way that the public has conversations online.

We realized we could use the same ideas and tech to defend companies that are vulnerable to these attacks. Energy companies, financial institutions, other companies managing critical infrastructure — they’re all equally vulnerable. Election manipulation is just the canary in the coal mine when it comes to the degradation of our discourse.

TC: Yours is a SaaS product, I take it. How does it work?

JM: Yes, it’s enterprise software. Our tech analyzes conversations across multiple platforms — social media and otherwise — looks for signs that they’re being tampered with, and identifies who is doing the tampering and what messaging they’re using to manipulate the conversation. With that information, our [customer] can decide how to respond. Sometimes it’s to work with the press. Sometimes it’s to work with social media companies to say, “These [accounts] are disingenuous and even fraudulent.” We then work with the companies to remediate the threat.

TC: Which social media companies are the most responsive to these attempted interventions?

JM: There’s a strong appetite for fixing the problem at all the media companies we talk with. Facebook and Google have addressed this publicly, but there’s also action taking place behind closed doors. A lot of individuals at these companies think there are problems that need to be solved, and they are amenable to [working with us].

The challenge for them is that I’m not sure they have a sense for who is responsible for [disinformation much of the time]. That’s why they’ve been slow to address the problem. We think we add value as a partner because we’re focused on this at a much smaller scale. Whereas Facebook is thinking about billions of users, we’re focused on tens of thousands of accounts and conversations, which is still a meaningful number and can impact public perception of a brand.

TC: Who are some of your customers?

JM: We [aren’t authorized to name them, but we] sell to companies in the entertainment, energy and finance industries. We’ve also worked with public interest organizations, including the Alliance for Securing Democracy.

TC: What’s the sales process like? Are you looking for shifts in conversations, then reaching out to the companies impacted, or are companies finding you?

JM: Both. Either we discover something or we’ll be approached and do an initial threat assessment to understand the landscape and who might be targeting an organization, and from there, [we’ll decide with the potential client] whether there’s value for them in engaging with us in an ongoing way.

TC: A lot of people have been talking this week about a New York Times piece that seemed to offer a glimmer of hope that blockchain platforms will move us beyond the internet as we know it today and away from the few large tech companies that also happen to be breeding grounds for disinformation. Is that the future or is “fake news” here to stay?

JM: Unfortunately, online disinformation is becoming increasingly sophisticated. Advances in AI mean that it will soon be possible to manufacture images, audio and even video at unprecedented scale. Automated accounts that seem almost human will be able to engage directly with millions of users, just like your real friends on Facebook, Twitter or the next social media platform.

New technologies like blockchain that give us robust ways to establish trust will be a part of the solution, even if they’re not a magic bullet.
