Eastern Analytics Blog - Project Spotlight

Leveraging Microsoft Azure to Collect, Analyze and Predict the Success of Stock Option Trades – Part 1


  • June 1, 2023
  • Eric Pietroski

Overview:  Stock options traders can use vertical spreads to turn a profit while limiting risk. In a vertical call spread, the trader purchases a call option while simultaneously selling a call option at a higher strike price for the same expiration date. As time passes, the trader benefits from the time decay of the options to realize a profit.

For example, if a stock is trading at $100 per share, it may be possible to buy a call option that expires 3 weeks in the future at a strike price of $90 and simultaneously sell a $95 strike call for the same expiration. This position represents a $5 spread ($95 - $90 = $5). Depending on the volatility of the underlying stock, it may be possible to purchase this spread for $3.80. If the stock price closes above $95 on the expiration date, the trade realizes its full profit of $1.20 per share (spread value of $5.00 - cost of $3.80 = $1.20), or 31.58% (profit of $1.20 / cost of $3.80 = 0.3158). Additional information about vertical option spreads can be found here.
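
To double-check the arithmetic, the same calculation can be written out in a few lines of Python (a minimal sketch using only the numbers from the example above):

```python
# Vertical call spread from the example: buy the $90 call, sell the $95 call.
long_strike = 90.00    # strike of the purchased call
short_strike = 95.00   # strike of the sold call
cost = 3.80            # net debit paid for the spread, per share

spread_width = short_strike - long_strike   # $5.00 maximum value at expiration
max_profit = spread_width - cost            # $1.20 per share
max_return_pct = max_profit / cost * 100    # roughly 31.6%

print(f"Max profit: ${max_profit:.2f} per share ({max_return_pct:.2f}%)")
```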

Identifying profitable trades can be a challenging and time-consuming task, even with a strategy in place. My own approach involved identifying a stock that I believed was well-suited to the strategy, researching options chains for various expirations, and running numerous calculations for various strike price combinations to determine potential returns. As you can imagine, this was a time-intensive process. 

To streamline this process, I developed a spreadsheet that leveraged an external API to pull real-time data for all options associated with a given stock symbol and expiration date. With this data, the spreadsheet then calculated the returns for all option combinations and displayed them as a percentage, providing a much more efficient and effective means of identifying profitable trades.
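
The workbook itself isn't reproduced here, but the core calculation it automates, computing the potential return for every pair of strikes in an option chain, can be sketched in Python. The option prices below are hypothetical placeholders; in the actual spreadsheet they come from the external API.

```python
from itertools import combinations

# Hypothetical call option chain for one symbol and expiration: strike -> option price.
calls = {90.0: 10.60, 92.5: 8.45, 95.0: 6.55, 97.5: 4.90, 100.0: 3.50}

spreads = []
for long_strike, short_strike in combinations(sorted(calls), 2):
    cost = calls[long_strike] - calls[short_strike]   # net debit to open the spread
    width = short_strike - long_strike                # maximum value at expiration
    if cost <= 0 or cost >= width:
        continue                                      # no profit potential; skip
    max_return_pct = (width - cost) / cost * 100
    spreads.append((long_strike, short_strike, round(cost, 2), round(max_return_pct, 1)))

# Highest potential return first, so the most attractive combinations surface at the top.
for long_strike, short_strike, cost, pct in sorted(spreads, key=lambda s: s[-1], reverse=True):
    print(f"Buy {long_strike} / sell {short_strike}: cost {cost}, max return {pct}%")
```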

 

[Image: Excel Profitable Trades]

 

To further simplify the process, I used a shading technique in Excel to identify any cells indicating a profit greater than 25%. Finding profitable trades wasn't always easy, however, and often required multiple attempts with various stocks before identifying a viable option.

 

Automating the Process with Azure:

As I was exploring the potential of both bullish (Call) and bearish (Put) spreads, I realized it would be valuable to know which stocks offered the best returns on any given day. By running my spreadsheet for multiple stocks simultaneously, I could quickly identify which ones were currently offering strong returns. And by storing the data for further analysis, I could also uncover the characteristics of winning trades and use those insights to improve my future trading strategies.

With these goals in mind, I decided to build out the process in Azure. The automated process outlined below describes how I leveraged Azure to collect, analyze and potentially predict the success of stock option trades.

 

System Implementation:

 

[Diagram: System Implementation]

 

Solution Components:

Azure Data Factory (ADF) – A data integration service used to copy data and orchestrate activities via pipelines.

Azure Data Lake Storage (ADLS2) – An Azure storage account providing blob, file share, and table storage.

Azure SQL Database – A relational database-as-a-service (DBaaS) hosted in Azure.

Power BI – A unified, scalable platform for self-service and enterprise business intelligence (BI).

 

Data Flow Description:

Pipelines were built within Azure Data Factory (ADF) to orchestrate the movement of data from external sources through the processing steps within Azure.

The following is an outline of those steps with numbering that corresponds to the diagram above:

Step 1:  Data is pulled from external APIs using both ADF and Databricks, depending on the need and complexity of the operation. For external APIs that require authorization tokens, Azure Key Vault was used to store those values as secrets. This approach increased security and kept sensitive information out of the code itself.
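
As an illustration of how this step might look from a Databricks notebook (a sketch only; the secret scope, secret name, endpoint, and query parameters below are hypothetical placeholders rather than the values actually used):

```python
import requests

# dbutils is provided by the Databricks runtime. Reading the token from an
# Azure Key Vault-backed secret scope keeps the credential out of the code.
api_token = dbutils.secrets.get(scope="kv-options-data", key="options-api-token")

# Hypothetical option-chain request for one symbol and expiration date.
response = requests.get(
    "https://api.example-marketdata.com/v1/options/chain",
    params={"symbol": "JNJ", "expiration": "2022-09-09"},
    headers={"Authorization": f"Bearer {api_token}"},
    timeout=30,
)
response.raise_for_status()
option_chain = response.json()
```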

Step 2:  Data obtained in the first step is moved into an Azure SQL database in its raw form.
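
The post doesn't specify whether this raw load runs through an ADF Copy activity or from Databricks; continuing the sketch above purely as an illustration, the parsed response could be appended to a raw landing table over JDBC (server, database, credentials, and table name are placeholders):

```python
# Build a DataFrame from the parsed API response (exact structure depends on the API)
# and append it, untransformed, to a raw landing table in Azure SQL.
# "spark" is the SparkSession provided by the Databricks runtime.
raw_df = spark.createDataFrame(option_chain["options"])

jdbc_url = (
    "jdbc:sqlserver://<server-name>.database.windows.net:1433;"
    "database=<database-name>;encrypt=true"
)

(raw_df.write
    .format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "bronze.option_quotes_raw")
    .option("user", "<sql-user>")
    .option("password", dbutils.secrets.get(scope="kv-options-data", key="sql-password"))
    .mode("append")
    .save())
```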

Step 3:  Once the data has landed in Azure SQL, it is processed using a multi-step medallion architecture. As the data moves through the processing layers, it is transformed and calculations are performed. These calculations help determine whether a given options chain contains any actionable trades. Additional calculations are performed for use in further analysis.
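
In the solution itself these transformations are T-SQL stored procedures triggered by ADF activities. Purely as a rough illustration of the layered promotion, the same calls could be made from Python with pyodbc; the connection string and procedure names below are hypothetical.

```python
import pyodbc

# Connection details are placeholders; in practice they would come from Key Vault.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server-name>.database.windows.net;"
    "DATABASE=<database-name>;UID=<sql-user>;PWD=<sql-password>;Encrypt=yes"
)
cursor = conn.cursor()

# Promote the data through the medallion layers: clean and conform first,
# then calculate spread returns and flag actionable trades.
for procedure in ("silver.usp_clean_option_quotes",
                  "gold.usp_calculate_spread_returns"):
    cursor.execute(f"EXEC {procedure}")
    conn.commit()

conn.close()
```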

Step 4:  On option expiration day, an automated process is run that pulls the closing quote for the underlying stock / equity. The closing price is then used to update all option spreads that expire on that day with a status of ‘Success’ or ‘Fail’. These updated trades are referred to as ‘mature’ trades.
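
A minimal sketch of that maturity update for call spreads, using the convention from the Overview that the full profit is realized when the stock closes at or above the short (higher) strike; the exact rule applied in the actual process isn't shown above, so treat this criterion as an assumption:

```python
def label_mature_call_spread(closing_price: float, short_strike: float) -> str:
    """Label an expiring vertical call spread from the underlying's closing quote.

    Assumes 'Success' means the spread finished at its full value, i.e. the stock
    closed at or above the short (sold) strike; anything else is marked 'Fail'.
    """
    return "Success" if closing_price >= short_strike else "Fail"


# Example: the underlying closes at 96.10 on expiration day for a 90/95 call spread.
print(label_mature_call_spread(96.10, 95.0))  # -> Success
```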

Step 5:  Data can now be refreshed in Power BI to analyze the characteristics of successful trades. The data is provided through a SQL view within the Azure SQL database, and some additional manipulation is done within the Power BI data model to support better analysis.

 

Solution Details:

Azure Data Factory

Azure Data Factory (ADF) is used as the orchestration tool for each piece of the process. All activities run on a scheduled basis, allowing quickly changing data to be captured as snapshots in time. Without some type of repeatable process, it would be difficult to collect an adequate dataset for analytics or for use with Azure ML.

ADF pipelines were developed in a structured manner. Parent pipelines were built to facilitate the running of "child" pipelines. This structure allowed for better code modularization, organization, and reuse. When possible, parameters were also used to facilitate the reuse of common pipeline activities.
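
For readers curious how a parameterized pipeline run can be started programmatically, here is a hedged sketch using the azure-mgmt-datafactory SDK; the subscription, resource group, factory, pipeline, and parameter names are placeholders rather than the ones used in this project, whose pipelines run on scheduled triggers.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder identifiers for the Azure subscription and Data Factory instance.
subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory_name = "<data-factory-name>"

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Start a parameterized "parent" pipeline, which in turn executes its child pipelines.
run = client.pipelines.create_run(
    resource_group,
    factory_name,
    "PL_Parent_Collect_Option_Quotes",  # hypothetical pipeline name
    parameters={"ticker": "JNJ", "expiration": "2022-09-09"},
)
print(f"Started pipeline run {run.run_id}")
```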


[Image: Solution Details 1]

 

 

[Image: Solution Details 2]

 

Azure SQL Database

The SQL database serves as an important piece of the overall process, providing a robust environment for data storage and preparation. Initially, data is retrieved and stored in its raw form. As the data moves through the ETL layers, it is transformed using stored procedures that convert data types and perform calculations.

For example, after option quotes are retrieved, the process calculates the potential return of every combination of strike prices for a given expiration date for the targeted stock. This data is transformed as it is moved to the gold layer. Below is a subset of the records produced for JNJ Call options with an expiration date of 09/09/2022. The total number of combinations produced for this one ticker at the time of creation was 253. In essence, this data is similar to the data shown in the Excel screenshot in the Overview section at the beginning of this post.

 

[Image: SQL Database]

 

Power BI

Power BI provides an easy-to-use and powerful platform for data analysis. Reports were built that allow trades to be analyzed using filters on many of the parameters, including days until maturity, potential profit as a percentage, equity symbol, and option type (Call or Put). These reports support detailed analysis of both historical trades and new trade ideas based on the data currently being collected.

[Image: Power BI report]

Conclusion:

Microsoft Azure offers a comprehensive suite of tools that enable the creation of end-to-end solutions for a wide range of applications. For this project, I was able to perform each task seamlessly across multiple integrated environments, leveraging the full capabilities of the platform. Power BI proved particularly useful, providing valuable insights when analyzing trades and identifying the key characteristics of successful ones.

After the initial build-out of this process, I decided to incorporate Azure Automated ML to see whether a trained model could be a good predictor of successful trades. The second part of this post covers that process in more detail and can be found here.

 

Tools Utilized:

Azure Data Factory (ADF), Azure SQL Database, Databricks, Azure Storage Account, Azure Key Vault

 

Languages Utilized:

T-SQL, Python, PySpark, DAX

 

This is part 1 of a 2-part post. Part 2 will dive into more detail about utilizing the data collected with Azure Automated ML.

 

Note:

The above notes are as of the original date of this post (June 1, 2023). The functionality will continue to evolve as the products mature.

 

 

 


Written by Eric Pietroski

Technical Lead / Architect - 25+ years of application development, Data Analytics and Transformations, MS Azure, Data Engineering, GitHub Actions, Azure Messaging Services, and Azure SQL
 
Eric can be reached at:  Eric.Pietroski@Eastern-Analytics.us
 
