AI Finance Talent Has Ended the Old Apprenticeship. Education Must Build the New One
David O'Neill
David O’Neill is a Professor of Finance and Data Analytics at the Gordon School of Business, SIAI. A Swiss-based researcher, his work explores the intersection of quantitative finance, AI, and educational innovation, particularly in designing executive-level curricula for AI-driven investment strategy. In addition to teaching, he manages the operational and financial oversight of SIAI’s education programs in Europe, contributing to the institute’s broader initiatives in hedge fund research and emerging market financial systems.
Internal AI now performs junior work, collapsing the old apprenticeship
Education must build AI finance talent—aim, audit, and explain models
Policy should fund governance sandboxes to grow trusted hybrid roles
The most meaningful metric in finance this year isn’t a trading multiple. It’s that internal, company-wide AI tools have reached the point where they handle most of the entry-level work once done by interns and junior analysts. Goldman Sachs has rolled out a generative-AI assistant across the firm, with around 10,000 staff already using it for document synthesis and analysis. Morgan Stanley reports near-universal adoption of an AI assistant among advisor teams. Citigroup says its internal AI saves about 100,000 developer hours each week. In the UK, three-quarters of financial firms use AI, and another tenth plan to follow. In short, the foundation of financial work is now automated in-house. This change does not mean fewer jobs. It creates a new ladder. It also compels schools, regulators, and employers to rethink how people learn and advance. At the center of this redesign is the emergence of AI finance talent: workers who can quickly aim, audit, and explain AI, and whose judgment and explanation skills anchor the new finance landscape.
AI Finance Talent and the End of the Old Apprenticeship
The traditional apprenticeship in banking was straightforward: new hires built models, cleaned data, drafted pitchbooks, and summarized filings until they could be trusted to make judgments. Today, internal platforms handle much of that initial work. FactSet now offers an AI Pitch Creator that saves junior bankers hours each week. UBS uses AI analyst avatars to produce research videos at scale, freeing analysts to focus on insight. UniCredit’s DealSync uses AI to identify hundreds of M&A leads without increasing headcount. Even when firms do not eliminate roles, the work shifts up the value chain. Entry-level tasks are shrinking; expectations are growing. The new entry-level requirement is not “clean the data” but “frame the decision.” That explains why the market premium for AI skills is real, not just talk: across sectors, jobs requiring AI skills commanded a salary premium of about 28% in 2025, and over half of such postings now sit outside core IT. These roles are hybrid: finance-first, AI-savvy.
This does not signal a decline in finance jobs. U.S. labor projections still predict growth for personal financial advisors and for financial and investment analysts over the next decade. This aligns with broader evidence: where AI supports complex decision-making, employment and sales can rise together. The change we face is not a “no-jobs” future but a “new-jobs” mix with higher skill demands. Method note: these projections are decade-long baselines, not short-term forecasts, and they assume adoption continues under current regulatory conditions. For educators, the message is clear: curricula must abandon the notion that students learn by doing low-stakes grunt work. The grunt work is disappearing. We need to teach judgment from day one.
Figure 1: Most UK finance firms already use AI and another slice is close behind—shrinking manual entry work and raising the bar for judgment.
What AI Finance Talent Changes in the Labor Market
Three facts anchor the labor story. First, adoption has become mainstream. In the Bank of England’s latest survey, 75% of financial firms already use AI, and an additional 10% plan to do so within 3 years. Second, major banks have scaled internal platforms rather than outsourcing essential workflows to public tools. Goldman Sachs rolled out its assistant across the firm; Morgan Stanley’s wealth unit experienced 98% adoption; Citi has transitioned from pilots to company-wide improvements. Third, the capability frontier is specific to the industry and widening: Bloomberg trained a 50-billion-parameter model on financial data to enhance narrow tasks like sentiment and entity tagging, the same functions junior staff once performed. The result is a workplace where judgment matters more than keystrokes, and where AI finance talent earns a premium for guiding, auditing, and explaining machine outputs.
Skeptics may argue that displacement at the bottom still represents displacement. They are right to be concerned about early career opportunities. Some datasets show fewer listings for entry-level roles that generative AI most impacts. Yet when we look at the bigger picture, firms that invest in AI at scale often expand higher-value work and reassign staff. JPMorgan’s leadership has stated that AI could enhance nearly every job at the bank, and external reports have documented hundreds of use cases. Internalizing tools—rather than relying on vendors—maintains tighter compliance and speeds up improvements. The real risk isn’t mass unemployment in finance; it’s a lack of workers who can blend industry knowledge with model oversight and narrative skills. That’s what AI finance talent represents in practice.
Building the AI Finance Talent Pipeline in Education
Programs that prepare students for finance must reshape the learning sequence. If automation handles extraction and drafting, students must focus on selection and defense. Change “learn Excel shortcuts” to “run an AI-assisted model, then stress-test and explain its limits.” Change “compile comps” to “audit the source of comps and adjust for model bias.” Change “write a market update” to “create a decision brief that links model outputs to policy or fiduciary responsibility.” This isn’t theoretical. It reflects how leading firms operate now: assistants summarize policies, cluster filings, and draft code; humans decide what holds up with clients or regulators. Schools should develop AI finance talent by teaching students to align model outputs with real constraints: risk, regulation, and time.
To facilitate that shift, universities can borrow ideas from medical training. Create supervised “AI rounds” in which students rotate through desk-like simulations. In one week, they use a domain model to analyze an earnings call and produce a one-page risk memo citing source documents. In another, they refine a draft from a pitch creator and write a three-paragraph rationale that a compliance officer could approve. In a third, they assess how a prompt change affects a valuation range and justify it with a clear audit trail. Method note: assessments can grade three dimensions (evidence, calibration, and clarity) rather than only the final output, which AI can generate. The goal is fluency: can a student explain what the model did, why it might be wrong, and how to guide it? That is the essence of AI finance talent.
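The three-dimension assessment lends itself to a simple, auditable grading aid. The sketch below is illustrative only: the dimension definitions follow the method note above, but the weights and the `RubricScore`/`weighted_grade` names are assumptions, not an established grading standard.

```python
from dataclasses import dataclass

@dataclass
class RubricScore:
    """One student submission scored on the three dimensions, each 0-1."""
    evidence: float     # claims traced back to source documents
    calibration: float  # stated uncertainty matches observed error
    clarity: float      # a compliance reviewer could follow the brief

def weighted_grade(score: RubricScore,
                   weights: tuple = (0.4, 0.35, 0.25)) -> float:
    """Combine the three dimensions into a single 0-100 grade.

    The weights are placeholders an instructor would tune per course.
    """
    dims = (score.evidence, score.calibration, score.clarity)
    return 100 * sum(w * d for w, d in zip(weights, dims))

grade = weighted_grade(RubricScore(evidence=0.9, calibration=0.8, clarity=0.7))
# about 81.5 with the placeholder weights
```

Because the rubric grades the process rather than the artifact, a submission whose final numbers came straight from a model can still score low if its evidence trail is thin.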
Policy for an AI Finance Talent Economy
Policymakers should treat the redesign of apprenticeships as essential infrastructure. The aim is not to fund another coding bootcamp; it’s to support shared “explainable finance” environments where schools, supervisors, and middle-market firms can train on safe data and publish repeatable exercises. Central banks and regulators have already cautioned that explainability, bias detection, and human oversight are necessary. Public environments can standardize these practices earlier in careers, rather than as remedial training. Sector-wide adoption data highlight why timing is crucial. With 75% of firms already using AI—and major players continually pulling ahead—the window for inclusive talent pipelines is now, not later.
Regulators can also promote an “internal-first” approach, where appropriate, backing firms that develop or closely manage in-house tools rather than untested external systems. This is both a precautionary measure and an educational one. Keeping model behavior and data flow visible to reviewers is better for supervision and training graduates in governance. Evidence from large institutions indicates that internal platforms enhance measurable productivity and adoption at scale. This supports wage premiums for hybrid roles and creates a more straightforward pathway from student projects to professional roles. A close connection between classroom environments and firm workflows will cultivate AI finance talent more quickly and equitably than a hands-off approach.
Educators, Administrators, and the New Ladder
Educators should eliminate outdated modules that assume entry work is manual. Replace them with three new habits that align with current practices. First, always combine generation with verification. Students must check AI-generated numbers against original documents and keep a record of the check. Second, teach translation. The best junior today turns a model’s complex outputs into clear, client-ready paragraphs with decisions and caveats. Third, integrate narrative skills with quantitative skills. Finance now involves both analytics and storytelling—quick synthesis, concise writing, and a human tone. Bloomberg’s finance-focused model and UBS’s video avatars signal this shift. The content aspect of finance is becoming more like journalism, but with fiduciary responsibilities. If we train students to report and not just repeat, their value increases in any role as they move up the ladder.
Figure 2: AI skills now earn a 28% pay premium, and over half of AI-skill demand sits outside IT—evidence that hybrid finance roles are ascendant.
Administrators should assess success by placement into hybrid roles rather than by software certifications. Evidence from Lightcast showing an AI skills premium should inform program design: build capstones that have students practice governance, not just prompting. Track outcomes in risk, research, and client advisory, where adoption is strongest and the combination with human skills is most effective. Meanwhile, universities should enter into data-sharing agreements with employers that allow students to work with synthetic yet realistic datasets—such as policy libraries, anonymized filings, and redacted client notes—so graduates enter the workforce with a governance portfolio rather than just a degree.
Internal AI now manages much of the work that once taught newcomers how finance operates. This can serve as a warning or as a call to action. If grunt work is disappearing, we must replace it with structured judgment. The evidence is compelling: adoption is widespread; specialized models are emerging; wage premiums reward hybrid skills; and projections still indicate growth in advisory and analysis. The labor market will favor individuals who can refine and validate a model and communicate the results clearly. That defines AI finance talent. Our mission is to construct a ladder fit for this era: classrooms that link to real governance; apprenticeships that simulate rather than impose drudgery; curricula that emphasize judgment and culminate in impact. Finance has upgraded its tools. Now, education needs to upgrade its training. The quicker we act, the fairer the new ladder will be.
The views expressed in this article are those of the author(s) and do not necessarily reflect the official position of the Swiss Institute of Artificial Intelligence (SIAI) or its affiliates.
References
Bank of England. (2024, November 21). Artificial intelligence in UK financial services – 2024.
Bank for International Settlements. (2024, June 25). Artificial intelligence and the economy: implications for central banks (BIS Annual Economic Report, Chapter III).
Bloomberg. (2023, March 30). Introducing BloombergGPT, Bloomberg’s 50-billion parameter large language model, purpose-built for finance.
Brookings Institution. (2025, July 10). Hybrid jobs: How AI is rewriting work in finance.
Brookings Institution. (2025). The effects of AI on firms and workers.
FactSet. (2025, January 15). FactSet launches AI-powered Pitch Creator.
Financial News London. (2024, April 8). Jamie Dimon: AI could ‘augment virtually every job’ at JPMorgan.
Lightcast. (2025, July 24). New Lightcast report: AI skills command 28% salary premium as demand shifts beyond tech.
Morgan Stanley. (2024, June 26). AI @ Morgan Stanley Debrief – launch debrief.
Reuters. (2025, June 23). Goldman Sachs launches AI assistant firmwide, memo shows.
Reuters. (2025, May 22). Citigroup launches AI tools for Hong Kong employees.
Reuters / Breakingviews. (2024, June 27). Banks grab AI-generated tiger by the tail.
UBS. (2025, June). UBS deploys AI analyst avatars [News report]. Financial Times.
U.S. Bureau of Labor Statistics. (2025, March 11). Incorporating AI impacts in BLS employment projections.
World Economic Forum. (2023). The Future of Jobs Report 2023.
Stop the Cross-Subsidy: AI Data Center Electricity Rates Shouldn’t Raise Household Bills
Ethan McGowan
Ethan McGowan is a Professor of Financial Technology and Legal Analytics at the Gordon School of Business, SIAI. Originally from the United Kingdom, he works at the frontier of AI applications in financial regulation and institutional strategy, advising on governance and legal frameworks for next-generation investment vehicles. McGowan plays a key role in SIAI’s expansion into global finance hubs, including oversight of the institute’s initiatives in the Middle East and its emerging hedge fund operations.
AI data centers are pushing grid costs onto households and schools
Create a separate rate class with minimum bills, upfront upgrade payments, and full transparency
Require self-supply or co-located power for very large campuses, with local community benefits
Wholesale electricity prices near major data center clusters have jumped by as much as 267% over the past five years. Those increases flow into utility bills, which in turn strain household budgets. Over the same period, at least $4.4 billion in local transmission upgrades were approved across seven PJM states, costs now imposed on regular customers so that hyperscalers can connect more quickly. This practice is unfair. It creates a cross-subsidy from families and schools to some of the world's largest corporations. If demand for AI keeps rising, regular customers will bear ever more grid costs that would not exist without these large-scale developments. The solution is not a slogan but careful rate design. AI data center electricity rates must form a separate class, with clear boundaries that shield households from stranded assets, capacity charges, and local wires built for single tenants. Anything less lets a wealth transfer go unnoticed.
The unfair cross-subsidy is evident
What has changed is not just the price but also the scale. Grid planners are observing a structural shift in load expectations. PJM, the largest power market in the U.S., forecasts its summer peak will grow from about 150 GW to roughly 220 GW over the next 15 years, mainly due to data-center growth. The grid’s independent market monitor estimates that data center loads contributed $9.33 billion in capacity-market revenues over one year under a scenario in which everything else remains the same. This is a clear cost signal reflected in retail bills. Residential customers cannot protect themselves from this risk; they pay for it.
The global picture mirrors this. The International Energy Agency projects that data-center electricity use will more than double by 2030, reaching about 945 TWh and growing nearly four times faster than overall power demand. Short-term U.S. forecasts point to record electricity consumption in 2025–2026, driven primarily by data centers and AI. Meanwhile, utilities in rapidly growing areas have filed resource plans and capital programs showing significant growth in large loads. These filings suggest that, absent reform, households will face higher capacity, transmission, and distribution charges. AI data center electricity rates must reflect the scale and risk of these changes rather than layer the costs onto the retail base.
Figure 1: Global data-centre demand roughly doubles in four years—local costs rise if rates don’t firewall the load.
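The growth rates quoted above can be sanity-checked with one line of arithmetic. The sketch below assumes a 2024 baseline of roughly 415 TWh (the IEA's published estimate, used here as an assumption) and recovers the implied annual growth rate to 2030.

```python
def implied_cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by two endpoint values."""
    return (end / start) ** (1 / years) - 1

# IEA base case: data-centre electricity demand grows from ~415 TWh
# (2024, assumed baseline) to ~945 TWh (2030).
rate = implied_cagr(415, 945, 2030 - 2024)
# roughly 0.147, i.e. ~15% a year, consistent with demand growing
# nearly four times faster than overall power consumption
```

The same helper can be pointed at the PJM figures in the previous section: 150 GW to 220 GW over 15 years implies a far gentler ~2.6% annual peak growth, which shows how concentrated the data-center surge is relative to the wider system.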
The political economy complicates things further. Tech companies often request power faster than permits and construction can keep up. Interconnection queues swell with requests well beyond the projects that will actually be built. When utilities build for maximum demand but actual demand arrives lower or later, ratepayers cover the cost of unused assets. The risk is clear: the downside is socialized while the upside is privatized. This is not an argument against AI; it is an argument for setting AI data center electricity rates based on the actual costs and risks of the demand, before new contracts lock in poor practices for 20 years.
A firewall for AI data center electricity rates
The solution starts with creating a separate rate class and establishing a tariff structure. Some regulators and legislators are beginning to do just that. One major utility has suggested a new rate class requiring long-term commitments (around 14 years) and minimum demand charges across transmission, distribution, and generation to avoid cost shifts if data center usage falls short. Analysts and policy experts recommend similar approaches: separate AI data center electricity rates with monthly minimums linked to contracted capacity; upfront contributions for grid upgrades; extended terms; and exit fees to protect other customers from stranded costs. States like Maryland and Oregon have gone even further, requiring or allowing specific rate schedules for data centers and establishing a separate class for large users. The direction is clear: place costs on those responsible for them.
The firewall tariff should be straightforward and consistent. First, minimum monthly bills should reflect the full, long-term costs of requested capacity, including local wires. Second, require upfront payments or contributions for custom upgrades to ensure ordinary customers aren't shouldering the burden. Third, establish credit standards, contract lengths, and exit fees that align with asset lifespans. These measures are not punitive; they reflect practices in generation interconnection and industrial rate design. They also align with what grid monitors already observe: in PJM, the rise in large-load interconnections and capacity pricing has noticeable effects on consumers. The goal is to make AI data center electricity rates self-sustaining, not to stifle innovation.
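The minimum-bill mechanic at the heart of the firewall tariff reduces to taking the greater of metered and contracted demand. The sketch below is a simplified illustration; the 85% floor and the demand charge are hypothetical parameters, not figures from any filed tariff.

```python
def monthly_demand_bill(contracted_mw: float,
                        metered_peak_mw: float,
                        demand_charge_per_mw: float,
                        floor_pct: float = 0.85) -> float:
    """Bill demand at the greater of the metered peak and a contractual
    floor (a percentage of reserved capacity), so the grid costs a campus
    caused are recovered even if its actual usage falls short.

    floor_pct and demand_charge_per_mw are hypothetical parameters.
    """
    billed_mw = max(metered_peak_mw, floor_pct * contracted_mw)
    return billed_mw * demand_charge_per_mw

# A 300 MW reservation that draws only 120 MW is still billed
# on the 255 MW floor, not on the metered peak.
bill = monthly_demand_bill(contracted_mw=300,
                           metered_peak_mw=120,
                           demand_charge_per_mw=10_000)
```

The point of the floor is allocative, not punitive: without it, the gap between reserved and realized demand becomes a stranded cost that migrates into general rates.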
Fairness also requires transparency. Utilities should provide a rolling account of data-center-driven capital—by project, substation, and cost category—and track recovery against the data-center tariff instead of general rates. Where state open records laws allow, commissions should mandate deal-level disclosures of capacity reservations and associated community protections. This will empower school districts, city councils, and small-business associations with the information they need to act before costs appear in future rate cases. The choice is between targeted AI data center electricity rates now or widespread financial pain later.
If you build it, power it yourself
Rate design alone is not sufficient. The most effective way to prevent cross-subsidies is to co-locate large loads with generation or to require self-supply for campuses above a defined threshold. Federal regulators are formally reviewing co-location issues for large loads such as AI data centers, beginning with PJM. This approach is essential. If a campus seeks hundreds of megawatts on a tight timeline, it should not trigger off-site wires and peaking plants funded by the wider community. Better models exist: merchant generation combined with long-term energy service agreements; on-site renewables and storage sized to the load; or gas units near the campus priced within the private contract rather than general rates.
Market players are already moving. Major utilities and investors are forming partnerships to build dedicated gas plants serving data-center clusters under long-term contracts. This does not settle the climate debate, but it does align incentives: those who benefit pay for the asset. These deals should come with strict guidelines. New generation should not enter the regulated rate base unless it supports the broader system, and contracts should include clean-energy transition commitments or renewable PPAs that scale with usage.
Most importantly, prohibit hidden recovery of dedicated campus assets through general riders. If the business case for a 500 MW campus is sound, its owners should carry its energy and capacity costs on their own financial statements. That is how we handle other large loads, and it should apply to AI as well.
Zoning and siting policy should align. Jurisdictions that accept data centers can require community benefits agreements tied to energy usage—funds designated for school energy upgrades, community solar subscriptions, and bill relief in host areas. These payments should be mandatory for projects that cause new substations or long feeders. They should scale with reserved capacity, not just square footage or headcount. Where regional reliability margins are low, local planners should insist on self-supply or co-location as conditions for approval. This prevents AI data center electricity rates from affecting everyone else's bills.
Translate fairness into rules we can enforce
What should educators, administrators, and policymakers do right now? First, engage early in rate cases and resource plans. When a utility files an integrated resource plan citing “extraordinary” growth driven by data centers, school districts and universities should intervene in the case. They should demand a separate class, minimum bills, and a ledger of data-center capital, not a promise to reconcile later. Many states already have filings and press materials predicting extraordinary load and related expansions; the public record is clear enough to justify immediate action.
Figure 2: Large-load growth—data centres included—pushes PJM’s peak from ~150 GW to ~220 GW; without a separate class, households absorb expansion costs.
Second, impose non-bypassable charges on the data-center class for local transmission upgrades. A recent review showed customers in seven PJM states were billed $4.4 billion for local data-center transmission projects approved in just one year. This illustrates the flow of costs when there's a regulatory gap. Commissions can close this by assigning specific costs to the customer responsible—just as they do for generator interconnections. If the tariff needs adjustment, make it now and prospectively.
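Cost-causation assignment, as commissions already apply it to generator interconnections, is pro-rata arithmetic. A minimal sketch, with entirely hypothetical project figures:

```python
def allocate_upgrade_cost(total_cost: float,
                          caused_by_mw: dict) -> dict:
    """Assign an upgrade's cost pro rata to the loads that caused it,
    instead of socializing it across the general rate base."""
    total_mw = sum(caused_by_mw.values())
    return {name: total_cost * mw / total_mw
            for name, mw in caused_by_mw.items()}

# Hypothetical: a $120M substation driven entirely by two campuses.
shares = allocate_upgrade_cost(120_000_000,
                               {"campus_a": 400, "campus_b": 200})
# campus_a bears $80M and campus_b $40M; households bear none
```

The mechanics are trivial; the policy work is the ledger that feeds them, which is why the transparency requirements above matter as much as the formula.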
Third, improve planning practices. Load requests are uncertain; many never materialize. Utilities should not build based on the most optimistic scenarios without solid minimum-bill protection. Require long-term commitments, strong credit support, and exit fees that cover the life of local wires. One major utility's proposal accomplishes this—offering a new class for high-energy users with 14-year commitments and minimum demand requirements—while national consultants recommend the same toolkit to protect residential customers. States are starting to legislate these directions, necessitating specific tariffs for data centers with clearly defined financial responsibilities.
Fourth, ensure consumer protection in the short term. Where bills are already rising, immediate assistance should focus on schools and low-income households most affected by increasing grid costs. Data-center hosts can provide local rate relief through impact fees and community benefits linked to reserved megawatts. Regulators can limit pass-throughs from data-center-related projects until a separate class is established. When new capacity auctions or transmission surcharges appear on bills, commissions should require a public breakdown of how much is caused by data centers. People have a right to understand the cause-and-effect relationship through precise numbers.
Finally, create a co-location pathway by default. FERC’s inquiry into rules for generator-load co-location enables PJM and other RTOs to follow suit. States can set a target from the governor’s office: any campus above a certain threshold—perhaps 100 MW—must self-supply, co-locate, or sign a full-requirements contract that shields its costs from general rates. If providers want public connections, they must agree to AI data center electricity rates that reflect all costs—without exceptions.
The numbers we started with should shape decisions for the next two years. Wholesale prices near data-center hubs are up as much as 267%. Billions in local wires have been approved and charged to customers. Capacity revenues have surged by billions as large loads have suddenly appeared. Left unchanged, this trend will continue: families, schools, and small businesses will pay more for assets they neither wanted nor needed, even when nearby campuses cut their usage. We can prevent that future by changing the default. Give AI data centers their own electricity rate class, with minimum bills reflecting contracted capacity. Require pre-payments for custom wires. Push huge campuses to co-locate or self-supply, keeping their assets off the regulated rate base. Then require transparency so communities can see the accounts in detail. The message is simple and fair: support the AI economy without making neighbors pay for it. The sooner we set these rules, the sooner we stop the hidden transfer showing up in monthly bills.
The views expressed in this article are those of the author(s) and do not necessarily reflect the official position of the Swiss Institute of Artificial Intelligence (SIAI) or its affiliates.
References
Bloomberg News. (2025, September 29). AI Data Centers Are Sending Power Bills Soaring (methodology note cites Grid Status and DC Byte). Retrieved November 5, 2025.
Brookings Institution. (2025, October 30). Boom or bust: How to protect ratepayers from the AI bubble. Retrieved November 5, 2025.
Deloitte. (2025, June 24). Can U.S. infrastructure keep up with the AI economy? (recommends separate rate class, minimum charges, exit fees).
Dominion Energy. (2025, April 1). Dominion Energy Virginia proposes new rates… (new rate class; 14-year commitments; consumer protections).
FERC. (2025, February 20). FERC orders action on co-location issues related to data centers running AI (PJM focus; reliability and fair costs).
IEA. (2025, April 10). AI is set to drive surging electricity demand from data centres… (base-case ~945 TWh by 2030; ~15% CAGR in data-center electricity).
PJM. (2025, January 30). 2025 Long-Term Load Forecast Report Predicts Significant Increase in Electricity Demand (summer peak path to ~220 GW).
PJM Independent Market Monitor. (2025, June 25). Market Monitor Report (capacity revenue impact of data-center load; $9.33 billion scenario).
Reuters. (2025, June 10). Data center demand to push U.S. power use to record highs in 2025, ’26, EIA says (STEO record consumption).
Reuters. (2025, July 15). Blackstone and U.S. utility PPL to build gas power plants in JV partnership (dedicated generation for data centers).
Utility Dive. (2025, October 1). Customers in seven PJM states paid $4.4B for data center transmission in 2024: report (UCS findings; regulatory gap).
Virginia Mercury. (2025, September 3). Dominion proposes higher utility rates, new rate class for data centers (details on minimum demand obligations; class scope).