Ghost: Free Postgres For Agents
Agents are desperate for ephemeral databases.
They spin up projects, fork environments, test ideas, and tear them down. Over and over. But every database on the market was designed for humans who provision once and stick around. Agents don't work that way.
Ghost is a database built for agents. Unlimited databases, unlimited forks, 1 TB of storage, and 100 compute hours per month. All free. Try it here.
When did you last actually read a fintech app's terms and conditions before clicking 'Allow access to my bank account'? Not skim — actually read it. If you're being honest, the answer is probably never. And you work in compliance.
That single click is what this edition is about.
THE SCENARIO
Meet Denise
Denise is 38, a marketing manager at a mid-sized insurance firm in Charlotte. She recently used a popular fintech app called BrightPath to apply for a kitchen renovation loan. She did what millions do every day: clicked 'Connect your bank account,' scrolled past the consent screen, checked the box, and moved on.
Within seconds, BrightPath had 24 months of her full transaction history — every direct deposit, every medical copay, every subscription charge, and every payment to a debt consolidation service she'd been quietly using for two years. Something she hadn't shared with anyone professionally.
What Denise didn't know: BrightPath used a data aggregator to pull that data. An intermediary she'd never heard of, operating across dozens of platforms — including some in the employer benefits space. Two months later, her company rolled out a financial wellness benefit through an HR tech vendor running on that same aggregator network. The vendor's AI could now detect signals consistent with financial distress, including patterns that match debt consolidation activity.
Denise never consented to any of that. She consented to BrightPath.
This isn't a hypothetical. It's the actual architecture of open banking today.
THE REGULATORY LANDSCAPE
A Rule That's Been Waiting Fifteen Years
On October 22, 2024, the CFPB finalized the Personal Financial Data Rights Rule under Section 1033 of the Dodd-Frank Act — a provision that had sat dormant since 2010. The rule gives consumers enforceable rights to access and share their financial data held by banks, credit card issuers, and digital wallet providers.
At a minimum, covered institutions must make available 24 months of transaction history, account balances, payment initiation information (including routing numbers), upcoming bill schedules, and account terms. That's not just a balance. That's a comprehensive financial biography.
The compliance timeline ran by asset size: the largest institutions — banks over $250 billion in assets and large non-banks over $10 billion in receipts — faced an April 2026 deadline. Mid-size banks from $10 billion to $250 billion had until April 2027. Smaller institutions from $850 million to $10 billion had until April 2028. Community banks below the SBA threshold were exempt from the mandatory API buildout, though still subject to GLBA and applicable state law.
As of this writing, a federal court has stayed implementation pending litigation. Forcht Bank and a coalition of banking groups filed suit in the Eastern District of Kentucky the same day the original rule was issued. The CFPB reopened rulemaking in August 2025. JPMorgan, meanwhile, signaled plans to charge aggregators for API access and later reached a quiet deal with Plaid.
Regulatory uncertainty is not a reason to wait. JPMorgan reported 1.89 billion data requests from third-party middlemen in a single month in 2025, with internal analysis showing only 13% tied to a consumer actively initiating a transaction. The other 87% were background pulls — continuous, largely invisible. The data flows are not waiting for the rule to be finalized.
THE PROBLEM STRUCTURE
Three Parties. One Data Flow. The Gap Is in the Middle.
Most people picture open banking as a two-party transaction: a consumer connects their bank account to an app. The reality has a third layer that changes everything.
The data provider (the bank) must make covered data available through a secure API. The authorized third party (the fintech app) is what the consumer actually interacts with — it may only collect data necessary for the requested service, must cap retention at one year, and must honor revocation promptly. The data aggregator — Plaid, MX, Finicity/Mastercard Open Banking, Yodlee — extracts and normalizes financial data at massive scale. And, as the Federal Reserve Bank of Kansas City has documented, aggregators are not currently subject to direct federal examination by any agency.
That's where the accountability gap lives. Third parties may also pass data onward — to sub-processors, analytics vendors, marketing partners — moving the consumer's financial information further and further from the context in which they originally shared it.
FRAMEWORK ANALYSIS
Five Places Where Open Banking Gets Privacy Wrong
I apply Helen Nissenbaum's contextual integrity framework to this analysis — a lens that asks not just whether a data flow is legally permitted, but whether it's appropriate given where the information originated. Five parameters. Denise's situation breaks each of them.
CI Parameter 1 — Context — Where Was This Information Born?
Banking is a high-trust, regulated context: GLBA protections, fiduciary duty, and federal oversight. When Denise shared her transaction data with her bank, she did so inside that context. The moment she clicked 'Allow,' her data migrated to a commercial fintech context governed by terms of service and business models built on monetizing financial insights. The GLBA protections don't follow the data.
The risk: Consumers have no reliable way to evaluate whether the receiving context offers equivalent protections — because it usually doesn't. Clicking one button crosses a regulatory border that most people don't know exists.
What to do: Privacy notices must explicitly explain context migration — not just 'we will share your data with a third-party app.' They need to disclose what changes, legally and practically, when data leaves a regulated banking relationship. Consent without that explanation isn't meaningful consent.
CI Parameter 2 — Actors — Who Are All the People in This Room?
When Denise authorized BrightPath, she had no idea an aggregator was a discrete actor in that chain — much less that the same aggregator served her employer's HR tech vendor. The disclosure called them 'our trusted data access partners.' Five words for a company holding 24 months of her financial life.
The risk: Consumers cannot evaluate who has their data, what obligations those parties carry, or who to contact when something goes wrong. Opaque actor chains aren't just a transparency problem — they're a consent problem.
What to do: The CFPB rule requires authorization disclosures to name data aggregators specifically. Pull up the actual screen a consumer sees when they authorize access — not your vendor contract, not your privacy notice. The consumer-facing flow is what matters here.
CI Parameter 3 — Attributes — What Is This Data Actually Revealing?
When consumers authorize access to their 'transaction history,' they picture their balance and recent purchases. What they're actually authorizing is a 24-month behavioral biography. Every medical copay reveals health conditions. Every legal services payment suggests legal trouble. Debt consolidation payments signal financial stress. Certain payment patterns can indicate pregnancy. The average consumer doesn't think they're disclosing any of that when they connect a bank account to get a better loan rate.
The risk: Generic category labels like 'transaction data' obscure the inference depth of what's actually being collected. Consent that is technically valid but built on practically hollow disclosure is still a consent failure.
What to do: Privacy impact assessments must examine what data reveals — including what can be inferred — not just what category it belongs to. If your disclosure says 'we access your transaction history' without explaining what that history reveals about health, finances, and personal circumstances, that's a gap.
CI Parameter 4 — Transmission Principle — Is the Data Being Used the Way You Said?
The stated principle is consumer-directed, purpose-limited authorization. In practice, it breaks down in two ways. First, fewer than 9% of consumers actually read privacy policies — the CFPB cited this in its own rulemaking record. Second, JPMorgan's internal analysis found that 87% of aggregator API calls in a single month were background pulls with no consumer actively doing anything. One-time consent was triggering continuous, invisible collection. That's not what people agreed to.
The risk: The transmission principle is honored in contract language and violated in operational reality on an enormous scale.
What to do: Audit actual data pull frequency for every aggregator connection your institution maintains. Compare it against what was disclosed at authorization. Where the operational pattern doesn't match the disclosure, that's your regulatory exposure — and it's measurable.
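The comparison in that audit is mechanical enough to script. Here is a minimal sketch, assuming you can export aggregator API logs with a per-call flag for whether an active consumer session initiated the pull; all names and fields (`ApiPull`, `consumer_initiated`) are hypothetical, not any aggregator's actual log schema:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class ApiPull:
    aggregator: str           # e.g. "plaid", "mx" (hypothetical log field)
    connection_id: str        # one consumer authorization
    consumer_initiated: bool  # True only if tied to an active consumer session

def background_pull_report(pulls: list[ApiPull],
                           disclosed_background_ok: set[str]) -> dict[str, float]:
    """Per aggregator, the share of pulls with no active consumer session.
    Flags any aggregator whose background pulls exceed what was disclosed."""
    totals = defaultdict(int)
    background = defaultdict(int)
    for p in pulls:
        totals[p.aggregator] += 1
        if not p.consumer_initiated:
            background[p.aggregator] += 1
    report = {}
    for agg, n in totals.items():
        share = background[agg] / n
        report[agg] = share
        if share > 0 and agg not in disclosed_background_ok:
            print(f"EXPOSURE: {agg} makes {share:.0%} background pulls, "
                  f"but its disclosure describes consumer-initiated access only")
    return report
```

Run against a month of logs, the flagged entries are exactly the measurable disclosure-versus-operations mismatches described above.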
CI Parameter 5 — Subject Expectations — What Did Denise Actually Expect?
Denise expected a better loan offer. She didn't expect an aggregator to retain a separate copy of her data. She didn't expect her employer's HR platform to detect her financial stress. And she certainly didn't expect that revoking BrightPath's access might leave the aggregator's copy intact. None of that was on the consent screen. None of it was in the terms she didn't read.
The risk: The expectation gap between what consumers believe they authorized and what they actually enabled is the central privacy failure of the current open banking ecosystem.
What to do: Run a 'reasonable consumer' test on every open banking disclosure. Show it to someone outside your compliance team and ask: What do you think will happen to your data? Where the expectation doesn't match reality, that's your compliance priority.
THE ANGLE MOST PEOPLE ARE MISSING
When Open Banking Enters the Workplace
Open banking is almost always framed as a consumer banking issue. That framing misses something important. Financial wellness benefits, earned wage access programs, emergency savings tools, and 401(k) optimization platforms almost universally connect to employees' personal bank accounts through data aggregators. When an employee uses an employer-sponsored financial app, their personal financial data is touching the employer's vendor ecosystem — and most employees have no idea.
More alarming: some HR tech vendors use that financial data to generate employee financial stress scores, framed as 'targeted support.' But financial distress data is a proxy for protected class characteristics. Medical debt signals health conditions. Debt consolidation activity surfaces financial stress. Certain payment patterns can indicate pregnancy. Using this data in an employment context, even with good intentions, creates discrimination exposure that most organizations haven't thought through.
'Voluntary' consent in an employment relationship is often a fiction. Employees reasonably fear adverse consequences for declining 'optional' benefits.
Banking is confidential and arm's-length. Employment is a power relationship. Standard consent models don't hold up when those two contexts collide.
CASE STUDY
$58 Million and 98 Million Users
The Plaid class action settlement — $58 million, 98 million affected users, finalized in 2021 — is the most instructive data point in open banking privacy history.
The allegation: Plaid obtained users' financial credentials under the guise of connecting to a specific app, then used them to collect far more data than consumers had authorized and shared it beyond what they had agreed to.
Run it through contextual integrity, and every parameter activates. Consumers thought they were connecting to a specific app; Plaid was an invisible intermediary. The data collected was far more than 'connect your bank account' implied. The purposes went beyond what anyone authorized. All five parameters, all at once, for 98 million people.
This happened before Section 1033 existed. The legal exposure has only increased since, from the CFPB framework, state AG enforcement, and a plaintiffs' bar that has learned exactly how to litigate these cases.
PRACTICAL ACTION
What to Actually Do
For Privacy Officers and Compliance Managers
– Start with a vendor inventory — document every third party that touches consumer or employee financial data through any API or aggregator relationship.
– Map each vendor relationship against all five contextual integrity parameters. Where the data flow doesn't match the original context, you have a disclosure gap.
– Verify the consumer-facing disclosures — pull up the actual authorization screen, don't rely on what your vendor contracts say the consumer sees.
– Build revocation audit trails — when a consumer revokes authorization, document that data collection stopped and covered data was deleted. Downstream deletion obligations need to cascade.
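The vendor-by-parameter mapping in the second step is easy to keep honest with a simple checklist structure. A minimal sketch, using the five parameters from this edition; `VendorAssessment` and its fields are illustrative, not any compliance tool's API:

```python
from dataclasses import dataclass, field

# The five contextual integrity parameters used in this analysis
CI_PARAMETERS = ("context", "actors", "attributes",
                 "transmission_principle", "subject_expectations")

@dataclass
class VendorAssessment:
    vendor: str
    # True means the data flow matches the original context for that parameter
    findings: dict[str, bool] = field(default_factory=dict)

    def gaps(self) -> list[str]:
        """Parameters unassessed or assessed as mismatched are disclosure gaps."""
        return [p for p in CI_PARAMETERS if not self.findings.get(p, False)]

def inventory_gaps(assessments: list[VendorAssessment]) -> dict[str, list[str]]:
    """Only vendors with at least one open gap appear in the output."""
    return {a.vendor: a.gaps() for a in assessments if a.gaps()}
```

Treating an unassessed parameter as a gap (rather than a pass) keeps the inventory conservative: a vendor drops off the exposure list only once all five parameters have been affirmatively checked.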
For Third-Party Risk Management Teams
– Update vendor contracts for open banking with explicit provisions covering data minimization, one-year maximum retention, secondary use prohibitions, aggregator sub-processor identification, and deletion obligations that cascade downstream.
– Treat aggregators as high-risk vendors. They are not subject to direct federal examination. Your due diligence is the primary control.
For HR and Benefits Leaders
– Audit every financial wellness tool your organization offers. Identify whether any employee-facing benefit connects to personal bank accounts through an aggregator.
– Rethink what 'voluntary' actually means. If an employee might face any adverse consequence for declining a financial wellness benefit — even an unstated one — you don't have voluntary consent. You have coerced consent dressed up as an opt-in.
CLOSING THOUGHTS
When Denise clicked 'Allow,' she wasn't making a financial privacy decision. She was trusting that someone in the system had done the work to ensure her data would be handled with care. That trust isn't naive. It's completely reasonable. And it's what makes getting this right matter.
Open banking can deliver real benefits — competitive rates, genuine portability, actual competition in markets incumbents have dominated for decades. But those benefits only materialize if the underlying infrastructure deserves the trust people place in it. Right now, in too many cases, it doesn't.
Contextual integrity gives us the framework to close that gap — not as an academic exercise, but as a practical audit tool. When data flows match the context where information was born, when all actors are visible and named, when data's full inferential depth is disclosed, and when transmission principles are actually enforced in operations — that's what trustworthy open banking looks like.
We're not there yet. Closing the gap is the work in front of us.
About This Newsletter
Remote Work Privacy Insights examines the evolving intersection of workplace monitoring, employee privacy rights, and emerging AI governance through the lens of contextual integrity theory.
DISCLAIMER: This newsletter is an independent publication applying academic privacy frameworks to real-world compliance challenges. Nothing here constitutes legal advice.

