How SSE Pricing Actually Works

A plain-English guide to the end-to-end pricing process — what happens, why it's painful, and why we're rethinking it.

The Big Picture

What is this process, and why does it exist?

Every month, SSE needs to figure out how much to charge customers for natural gas across multiple states — Georgia, Illinois, Ohio, Pennsylvania, and Michigan. That sounds simple, but it's not. The price customers pay is built from layers of inputs: the raw commodity price on the market, the cost to transport it, local fees and tariffs, weather-related adjustments, and business margin decisions.

The "Development of Pricing Process" is the entire journey from "what did natural gas cost on the open market today?" all the way to "this is the rate showing up on a customer's bill."

Think of it like a restaurant pricing its menu. The chef doesn't just look at what the ingredients cost. They factor in what they paid the supplier, shipping costs, storage, prep labor, local taxes, what competitors charge, and what profit margin the owner wants. Then someone has to actually type those prices into the register and make sure the menu boards are correct. That's essentially what this process does — but for natural gas, across five states, with regulatory oversight and SOX compliance requirements.

The 6 Steps, in Order

Each step feeds into the next, like an assembly line:

A. Get Market Prices
B. Add Extra Costs
C. Calculate Price Curves
D. Build Final Pricing
E. Marketing Approves
F. Upload to Billing

The outputs of each step become the inputs for the next. If something is wrong in Step A, every subsequent step inherits that error. If someone makes a typo in a spreadsheet in Step B, it could flow all the way through to a customer's bill in Step F.

What Happens in Each Step

A. Market Settlement — "Get the raw prices"

What's happening: Every day, the Risk Management team pulls natural gas market prices from two external sources — Platts (a pricing service) and ICE (an exchange). These prices arrive as data files via FTP (basically, automated file transfers). The team then loads those prices into Openlink, the company's trading and risk management system. There's also a monthly process where they download city gate index prices from a website and load those into Openlink too.

In addition, they handle volatility uploads — options pricing data that is also pulled from files and loaded into the system. All of this establishes the "base price" that everything else builds on.
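Most of that daily flow (fetch a file, confirm it arrived, extract the prices, flag anything odd) is mechanical enough to script. Below is a minimal sketch assuming a simple CSV layout; the column names and the `load_settlements` helper are hypothetical, and real Platts/ICE feeds will differ:

```python
import csv
import io

# A made-up daily settlement file; real Platts/ICE layouts differ.
RAW = """symbol,settle_date,price
NG_2024-07,2024-06-03,2.715
NG_2024-08,2024-06-03,2.802
NG_2024-09,2024-06-03,
"""

def load_settlements(text):
    """Split settlement rows into loadable prices and exceptions,
    so a person reviews only the flagged rows, not every file."""
    good, bad = [], []
    for row in csv.DictReader(io.StringIO(text)):
        try:
            row["price"] = float(row["price"])
            good.append(row)
        except (TypeError, ValueError):
            bad.append(row)  # blank or non-numeric price needs a human look
    return good, bad

good, bad = load_settlements(RAW)
print(f"{len(good)} prices ready to load, {len(bad)} flagged for review")
```

The point is not the parsing itself, but the shape of the safeguard: the script loads what is clean and surfaces only the exceptions, instead of a person eyeballing every row.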

Why it's painful: Almost every step involves manually opening a spreadsheet, copying data, and pasting it into another spreadsheet or system. The team works with 6+ spreadsheets for this step alone. The files arrive by FTP, but a person still has to manually confirm that they arrived, open them, copy specific data into specific templates, and paste it in the right place. One wrong paste and the base prices for the entire company are wrong.
Key risk: None of these spreadsheets have protections on their formulas or cell references. That means anyone could accidentally break a formula, overwrite a link, or corrupt the data — and there's no built-in safeguard to prevent it.
B. Cost Adders — "Stack on the extra costs"

What's happening: The base commodity price (NYMEX) is only part of what a customer pays. On top of that, there are transportation costs, pipeline tariffs, local delivery charges, fuel costs, weather adjustments, and various fees. Different teams compile these "cost adders" for each market — Georgia gets its own model, Illinois gets its own, and the expanded markets (OH, PA, MI) get theirs.

People from Business Development gather tariff data from pipeline company websites (Sonat, Transco). Physical Operations provides basis differentials and fuel data. FP&A compiles load shape and weather data. All of these inputs funnel into a cost adder model for each market.
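The "stacking" itself is simple arithmetic: the delivered cost is the base commodity price with every adder added on top. A toy illustration (all figures invented):

```python
# All figures invented for illustration ($/MMBtu).
nymex_base = 2.75
adders = {
    "transportation": 0.42,
    "pipeline_tariff": 0.18,
    "fuel": 0.06,
    "weather_adjustment": 0.04,
}

# The delivered cost is simply the base price with every adder stacked on.
delivered_cost = nymex_base + sum(adders.values())
print(f"Delivered cost: ${delivered_cost:.2f}/MMBtu")  # $3.45/MMBtu
```

Because the logic is the same for every market, a single parameterized model like this could in principle replace the three separate spreadsheets — only the adder values differ per market.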

Why it's painful: Three separate markets means three nearly identical processes running in parallel — each with its own spreadsheet, its own set of manual inputs, and its own people. The Georgia model, the Illinois model, and the expanded markets model all follow roughly the same logic, but they're maintained as completely separate files. That means tripled maintenance, tripled opportunity for error, and tripled effort every time something changes.
Key risk: Someone from one team manually types a tariff number from a website into a spreadsheet. There's no automated check that they typed it correctly. Multiply that by three markets, and you see the risk surface.
C. Weighted Average Strips — "Calculate the price curves"

What's happening: Once we know the monthly settlement prices (from Step A), the FP&A team calculates "weighted average strips." In plain English, this means: "Based on what gas costs over the next several months and how much gas each customer is expected to use, what's the blended average price we should plan around?"

A separate team called Quantitative Analytics provides usage-per-customer data (how much gas each customer type typically uses). This data changes only about twice a year. The FP&A team plugs it into their pricing models to weight the price curves appropriately.
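The weighting itself is a straightforward usage-weighted average: months where customers burn more gas count for more in the blend. A toy example with invented prices and usage figures:

```python
# Hypothetical forward prices and per-customer usage for a winter strip.
months = ["Nov", "Dec", "Jan", "Feb"]
prices = [3.10, 3.45, 3.60, 3.40]  # $/MMBtu forward settlement per month
usage = [60, 110, 130, 100]        # expected therms per customer per month

# Heavier-usage months pull the blended price toward their settlement.
weighted_strip = sum(p * u for p, u in zip(prices, usage)) / sum(usage)
print(f"Weighted average strip: ${weighted_strip:.2f}/MMBtu")
```

A plain (unweighted) average of those four prices would understate the cost, because the most expensive months are also the highest-usage ones.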

Why it's painful: This step is almost entirely copy-paste. The analyst receives the NYMEX settlement email at 6pm, then opens the Monthly Settlement spreadsheet, copies prices out of it, pastes them into the GA Pricing spreadsheet, waits for the model to calculate, then copies the output and pastes it into yet another spreadsheet (NYMEX & Strips). Repeat for Illinois with a different set of files. Every handoff is a manual copy-paste between spreadsheets.
D. Data Analytics Pricing — "The central hub"

What's happening: This is the most complex step. The Data Analytics team pulls together everything — the cost adders from Step B, the weighted average strips from Step C, the market data from Step A — and builds gross margin analysis files for each state. These files answer the question: "If we charge this price, what's our margin after all costs?"
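The margin question itself is simple once the inputs from Steps A–C are assembled. A sketch with hypothetical numbers:

```python
def gross_margin(retail_price, nymex, adders):
    """Per-unit margin: what's left of the retail price after the
    base commodity cost and every cost adder are subtracted."""
    return retail_price - (nymex + sum(adders))

# All numbers invented for illustration ($/MMBtu).
margin = gross_margin(retail_price=4.10, nymex=2.75, adders=[0.42, 0.18, 0.06])
print(f"Margin: ${margin:.2f}/MMBtu")
```

The complexity of this step is not the formula; it is assembling consistent, current inputs for every state and every rate.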

From there, recommended prices are assembled into a Final Pricing Deck. That deck goes through multiple approval stages: a Directors meeting reviews it, a Hedge Committee approves it, and for some states, the approved prices get submitted to regulatory authorities. Prices also get published to market websites and portals where customers can see available rates.

This step also handles B2B pricing (rates for business customers) and manages the back-and-forth with Marketing on rate code validation.

Bright spot: There's already a bot here that automatically compiles data from source feeds, populates the gross margin analysis files, runs validation checks, and emails Marketing. This is the one place in the entire process where automation exists, and it works well. It's a model for what could be done elsewhere.
Why it's painful: Even with the bot, there are still many manual steps — transferring approved pricing into check files, copying "barrels" between spreadsheets, validating rate codes by comparing spreadsheet data against the billing system (Prime), and coordinating approvals via email. This step alone involves 10+ distinct spreadsheets.
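The rate code validation in particular is a natural candidate for automation, because it is just a set comparison between what the pricing sheets expect and what the billing system holds. A sketch with invented rate codes (the code format is hypothetical):

```python
# Invented rate codes: what the pricing sheet expects vs. what the
# billing system (Prime) actually contains.
sheet_codes = {"GA-R-01", "GA-R-02", "IL-R-07", "OH-R-03"}
prime_codes = {"GA-R-01", "IL-R-07", "OH-R-03", "OH-R-99"}

missing_in_prime = sheet_codes - prime_codes      # expected but absent
unexpected_in_prime = prime_codes - sheet_codes   # present but not expected
print(sorted(missing_in_prime), sorted(unexpected_in_prime))
```

Today that comparison is done by a person reading two sources side by side; a check like this could run on every cycle and only escalate the differences.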
E. Price Determination — "Marketing decides the final price"

What's happening: The Marketing team receives the gross margin analysis models from Data Analytics and uses them — along with competitor pricing and market intelligence — to decide the actual prices customers will see. They're not just rubber-stamping the numbers; they're making strategic pricing decisions.

For Georgia, Marketing compiles cost adders, NYMEX strips, and budgeted margins into one consolidated view, reviews competitors, and determines pricing. For Illinois and expanded markets, they review separate models that arrive on different dates throughout the month (PA on the 15th/16th, MI on the 20th/21st, etc.).

Marketing also generates promo codes — special pricing offers for customers — for both Georgia and Illinois.

Why it's painful: Models arrive at different times for different markets, so this isn't one clean process — it's a staggered set of reviews happening across the month. Each market's data comes in via email as a spreadsheet, gets reviewed, and then results get emailed back. It's a lot of email-based coordination with no central system tracking where things stand.
F. Pricing & Promo Code Upload — "Put the prices in the billing system"

What's happening: After everything is approved, someone has to actually put the prices into the systems that generate customer bills. This is trickier than it sounds because there are multiple billing systems:

Prime handles Illinois and Expanded Markets (OH, PA, MI). Safari handles B2B rates for Georgia and FNG. Banner handles Georgia GNG retail rates, and it's managed by a third-party vendor called Vertex.

For each system, the team first uploads prices to a test environment, validates that everything looks correct, generates approval reports, gets sign-off from Marketing and FP&A, and only then pushes to production (the live system). Three formal SOX compliance controls (RE81, RE82, RE83) govern these steps — meaning the approvals are legally required financial controls, not optional checks.
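The validate-before-promote pattern described above boils down to an equality check between the approved deck and what the test environment actually holds after upload. A sketch with hypothetical rate codes and prices:

```python
# Hypothetical approved deck vs. what the test environment holds after upload.
approved = {"OH-R-03": 0.612, "PA-R-11": 0.587, "MI-R-05": 0.634}
test_env = {"OH-R-03": 0.612, "PA-R-11": 0.578, "MI-R-05": 0.634}

# Promote to production only when every approved price matches exactly.
mismatches = {
    code: (price, test_env.get(code))
    for code, price in approved.items()
    if test_env.get(code) != price
}
ready_for_production = not mismatches
print(mismatches)  # {'PA-R-11': (0.587, 0.578)}
```

Today this comparison is spread across 4+ reports and a chain of approval emails; the check itself is small, and the approvals could sign off on its output instead of on manually assembled reports.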

There's also a quarterly tax rate update process where Georgia tax rates are reviewed, verified by the Tax Team, approved by directors, and updated in Banner through Vertex.

Why it's painful: The upload process for Expanded Markets alone involves uploading files to a test system, generating 4+ approval reports (Reports 64, 65, 67, comparison reports), emailing them to multiple people for review, getting approval emails back, then saving to production, then generating more reports to confirm everything saved correctly. It's a multi-day process of uploading, reporting, emailing, waiting, and re-checking.
Key risk: Because there are multiple billing systems with different interfaces and processes, it's possible for prices to be correct in one system but wrong in another. The Georgia rates go through Vertex (a third party), which adds a layer of coordination where the team can't directly control what gets entered.

Why This Process Is Difficult

The four themes that keep coming up
30+ Spreadsheets
Over thirty individually maintained spreadsheets pass data between teams. Each one is a potential point of failure — a broken formula, a stale link, a wrong paste.

0 Formula protections
Control #4 explicitly flags that none of these spreadsheets prevent formulas from being edited or links from breaking. This is a known gap.

6+ Departments involved
Risk Management, Business Development, Physical Ops, FP&A, Data Analytics, Marketing, Customer Ops, Leadership, Tax, and a third-party vendor all touch this process.

~20 Manual handoffs per month
Most handoffs are someone emailing a spreadsheet to someone else, who copies data from it into another spreadsheet and emails the result to the next person.

Why We're Rethinking This

The case for change
⚠️ Risk Exposure
With 30+ unprotected spreadsheets, a single accidental edit could flow incorrect prices all the way to customer bills. The current process relies on people catching errors, not systems preventing them.

⏱️ Speed
The end-to-end process takes most of the month. Prices arrive at 6pm, analysts manually copy-paste through multiple files, approvals happen via email, and uploads take days. There's very little room for error correction.

🔄 Duplication
Three markets run nearly identical workflows with separate spreadsheets. The same logic is maintained three times. When a process improvement is made, it has to be replicated across all three — and often isn't.

The goal isn't to scrap everything overnight. It's to look at this process with fresh eyes and ask: where are we doing work that a system could do for us? Where are we passing data by email that could flow automatically? Where are we checking things manually that could be validated by a rule?

The bot that already exists in Step D proves this can work. It compiles data, validates it, and emails the results — automatically. The question is: what would this process look like if that approach were applied more broadly?

Glossary — Terms You'll Hear in the Room

Quick reference for the systems, acronyms, and jargon
NYMEX
New York Mercantile Exchange — the market where natural gas futures are traded. The "NYMEX price" is the base commodity price.
Openlink
The company's trading and risk management system. Market prices get loaded here and it's the system of record for settlement prices.
Prime
The billing system used for Illinois and Expanded Markets (OH, PA, MI). Prices and promo codes are uploaded here.
Safari
The billing system used for B2B (business-to-business) rates, specifically for GA and FNG. Also called "Rate Maintenance."
Banner
The billing system for Georgia GNG retail rates. Managed by Vertex, a third-party vendor — the team doesn't directly enter rates here.
Vertex
A third-party vendor that manages Banner. They enter approved rates into the test environment, team verifies, then Vertex pushes to production.
Cost Adder
Any cost layered on top of the base NYMEX price — transportation, tariffs, fuel, fees, weather adjustments, etc.
Weighted Average Strip
A blended forward-looking price that accounts for expected customer usage patterns across future months. Used for planning and pricing decisions.
GM Analysis
Gross Margin Analysis — spreadsheets that show the margin between what customers pay and what the gas costs (including all adders). Built per state.
FP&A (DA)
Financial Planning & Analysis, Data Analytics team. The central pricing coordination team that builds the Final Pricing Deck.
VEDO Check
A validation step where Marketing sends pricing upload files to FP&A DA to review for accuracy and completeness before final approval.
SOX Controls
Sarbanes-Oxley compliance requirements. Controls RE81, RE82, and RE83 are legally required approval gates before prices go live.
LDC
Local Distribution Company — the utility that delivers gas to the end customer. Different LDCs serve different regions.
S Drive
A shared network drive where most spreadsheets are stored. Access is controlled by "Cool Compliance" security.
MDT Mapper
A tool used to push prices into Openlink. Part of the monthly settlement loading process.
IFF Composite Curves
Calculated forward price curves that are loaded into Openlink. Built manually using the EndureCurveDumpMDW and IFF Calculator spreadsheets.
Platts / ICE
External data sources. Platts provides daily gas settlement prices. ICE (Intercontinental Exchange) provides forward curves and options data.
Basis
The price difference between the NYMEX benchmark and the actual delivery point. Varies by location and is a key cost adder component.