Best practices for AI financial data analysis: 3 steps to build your own “after-hours AI strategist”, even if you’d rather lie flat

In financial investing, investors often feel lost in the face of massive data and complex market swings. This article shows how to use AI to build your own “after-hours AI strategist”: by constructing a multi-dimensional “TAV” technical analysis framework and combining Python scripts with an HTML template, the whole process from data acquisition to decision suggestions is automated. In just 3 steps, you can generate a clear decision dashboard after the market closes, grasp buy and sell timing with ease, and make investment decisions more scientific and efficient.

Fellow stock investors, do you know the feeling: you finally finish work and squeeze out some time, only to face an overwhelming wall of K-lines, moving averages, MACD… Dozens of indicators crowd your phone screen, your eyes glaze over, and you have no idea where to start.

Even when you find some analysis or commentary, it is full of jargon like “oscillating pattern, sell high and buy low”, and you come away even more confused.

In fact, there is only one core pain point: you spend a lot of time staring at a pile of data and still don’t know whether to buy or sell tomorrow. Yet the tools on the market are all bells and whistles that never get to the point.

Don’t worry, AI is coming.

Today, let’s use AI to put together an “after-hours AI strategist” that is exclusively yours. It can turn complex data into a clear decision dashboard. After the market closes, you only need 30 seconds to glance at the report it generates and know immediately: “What price range should I buy at tomorrow? What price range should I sell at?”

The final report looks like this:

No more preamble, let’s get straight to the practical part.

1. The essence of financial data analysis: looking for “buy low, sell high” signals amid uncertainty

Before diving into the AI practice, we first need to understand what financial data analysis is actually doing.

To put it bluntly, the underlying logic of making money from investing boils down to four words: buy low, sell high.

And all complex financial data analysis is essentially answering two questions:

  1. Where does the price stand now? (High or low?)
  2. Where might it go next? (Up or down?)

To answer these two questions, traditional financial data analysis usually does it like this:

This process is logically sound, but for us ordinary investors, every step is a hurdle:

  • Data acquisition is hard: data sources vary wildly in quality, and stable, clean data is not easy to find.
  • Indicator calculation is tedious: MACD, RSI, BOLL… each indicator has a complex formula that is basically impossible to compute by hand, and relying on charting software is not flexible enough.
  • Manual interpretation is dizzying: faced with a pile of charts and indicators (golden crosses, death crosses, top divergences, bottom divergences), all kinds of signals intertwine, and without a few years of experience it is easy to get lost.
  • Decision latency is high: by the time you finally finish the analysis, the best trading window may already have passed.

Fortunately, AI is coming.

The advantage of AI is that it can automate the tedious, repetitive, time-consuming work above: tirelessly processing massive data, calculating indicators, and giving preliminary conclusions based on the rules you set, 24/7.

Our goal is to use AI to upgrade traditional analysis processes and create an automated workflow from data acquisition to decision-making recommendations.

2. Methodology: Construct a multi-dimensional “TAV” technical analysis framework

Before doing any analysis, there must be a playbook.

Throw a pile of indicators directly at the AI and it will be just as overwhelmed. We need a simple, effective analysis framework to guide it.

Here, I would like to introduce a “TAV technical analysis framework” that I commonly use.

“TAV” stands for Trend, Acceleration (momentum), and Volume. The core idea of the framework is that a healthy buy or sell signal usually requires confirmation across all three dimensions at once (“resonance”), which filters out most false signals.

1. Trend – What is our general direction?

What it answers: whether the current price is in an upward, downward, or sideways channel. Going with the trend is the first principle of trading.

Core indicators: moving averages (MA). For example, MA5, MA10, and MA20 in a bullish alignment indicate a short-term uptrend, while MA20/MA60 serve as important medium- to long-term support/resistance levels.
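To make the trend check concrete, here is a minimal pandas sketch (an illustration of the idea, not the script the AI will generate later) that computes MA5/MA10/MA20 and labels the latest alignment:

```python
# Minimal sketch: label the latest moving-average alignment from a close-price series.
# Assumes `close` holds at least 20 daily closes, newest last.
import pandas as pd

def ma_alignment(close: pd.Series) -> str:
    ma5 = close.rolling(5).mean().iloc[-1]
    ma10 = close.rolling(10).mean().iloc[-1]
    ma20 = close.rolling(20).mean().iloc[-1]
    if ma5 > ma10 > ma20:
        return "bullish alignment"   # short-term uptrend
    if ma5 < ma10 < ma20:
        return "bearish alignment"   # short-term downtrend
    return "tangled / sideways"      # MAs intertwined (also returned if fewer than 20 rows)
```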

2. Momentum – Is there enough power behind the rise or fall?

What it answers: the “acceleration” of price change. The trend may be up, but if momentum is exhausted, a pullback or reversal may follow.

Core indicators: Relative Strength Index (RSI) and Moving Average Convergence Divergence (MACD). RSI entering the overbought zone (>70) warns of pullback risk, while entering the oversold zone (<30) suggests a possible rebound. MACD golden crosses and death crosses provide momentum-based buy and sell signals.
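For illustration, both momentum indicators can be computed with plain pandas (TA-Lib’s talib.RSI and talib.MACD are drop-in alternatives). Note two assumptions in this sketch: the RSI uses the simple-average variant, and the histogram is doubled in line with common A-share charting conventions.

```python
# Minimal sketch: RSI and MACD from a close-price series (newest value last).
import pandas as pd

def rsi(close: pd.Series, period: int = 6) -> pd.Series:
    delta = close.diff()
    gain = delta.clip(lower=0).rolling(period).mean()
    loss = (-delta.clip(upper=0)).rolling(period).mean()
    return 100 - 100 / (1 + gain / loss)

def macd(close: pd.Series):
    dif = close.ewm(span=12, adjust=False).mean() - close.ewm(span=26, adjust=False).mean()
    dea = dif.ewm(span=9, adjust=False).mean()
    bar = 2 * (dif - dea)            # histogram, doubled per common A-share convention
    return dif, dea, bar

def macd_cross(dif: pd.Series, dea: pd.Series) -> str:
    # Compare yesterday and today to detect a fresh cross.
    if dif.iloc[-2] <= dea.iloc[-2] and dif.iloc[-1] > dea.iloc[-1]:
        return "golden cross"
    if dif.iloc[-2] >= dea.iloc[-2] and dif.iloc[-1] < dea.iloc[-1]:
        return "death cross"
    return "no cross"
```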

3. Volume – Is participation enthusiastic?

What it answers: volume is the “fuel” for price movement. A rise on expanding volume is healthy; a rise on shrinking volume may lack strength.

Core indicator: volume bars (VOL). At key levels (such as a break above resistance or a break below support), a signal is only reliable when it is confirmed by volume.
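A rough volume check, again only a sketch: the 1.8× and 0.6× cutoffs are illustrative assumptions you should tune to the stock you follow.

```python
# Minimal sketch: compare the latest volume against its 5-day average.
import pandas as pd

def volume_status(volume: pd.Series, surge: float = 1.8, shrink: float = 0.6) -> str:
    vol_ma5 = volume.rolling(5).mean().iloc[-1]
    ratio = volume.iloc[-1] / vol_ma5
    if ratio >= surge:
        return "significant expansion"   # heavy participation at this level
    if ratio <= shrink:
        return "contraction"             # participation drying up
    return "moderate"
```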

Based on the TAV framework, a good buy signal is often the combination of “trend turning up + momentum kicking in + volume expanding”. Conversely, a sell signal could be “uptrend + momentum exhaustion (overbought) + abnormal volume”.
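Putting the three labels together, the resonance rule can be sketched as a tiny decision function. The label strings follow the sketches above and are illustrative only, not trading advice.

```python
def tav_signal(trend: str, momentum: str, volume: str, rsi6: float) -> str:
    # Candidate buy: uptrend + fresh momentum + expanding volume.
    if trend == "bullish alignment" and momentum == "golden cross" and volume == "significant expansion":
        return "candidate buy"
    # Candidate sell: uptrend but momentum exhausted (overbought) or rolling over.
    if trend == "bullish alignment" and (rsi6 > 70 or momentum == "death cross"):
        return "candidate sell / take profit"
    return "no clear signal"
```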

OK, with this framework in hand, we can design more reliable, better-reasoned prompts and turn the AI into an analyst that understands TAV.

3. Hands-on: 3 steps to build your “after-hours AI strategist”

Now, let’s infuse the TAV framework into our AI workflow.

Step 1: Feed the AI data – Get market data

The first step is to find a stable, free data source. Two recommendations:

AKShare is an excellent open-source Python financial data library that provides real-time and historical data for domestic stocks, funds, futures and other financial markets. It is maintained by domestic developers, and its data sources stably cover the official sites of the Shanghai and Shenzhen exchanges as well as mainstream financial platforms.

Project address: https://github.com/akfamily/akshare

Chinese Documentation: https://akshare.akfamily.xyz/

AKTools uses FastAPI to wrap AKShare into an HTTP API, so that programs on any platform can fetch financial data through simple API calls.

Project address: https://github.com/akfamily/aktools

Because we will later build an automated AI financial data analysis agent with Dify, we choose the latter.

# Install aktools
pip install aktools
# Start the local HTTP API service (listens on http://127.0.0.1:8080 by default)
python -m aktools
# Then request stock data over HTTP (example: BYD 002594, daily bars):
# /api/public/stock_zh_a_hist?symbol=002594&period=daily

After installation and startup, requesting http://<your server address>/api/public/stock_zh_a_hist returns structured historical market data in JSON format, with price and volume fields indexed by date, laying the foundation for the indicator calculations that follow.
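For reference, here is a small fetch-and-save sketch against a local AKTools instance. The host, port and query parameters are assumptions based on a default `python -m aktools` deployment, so adjust them to your own setup.

```python
# Minimal sketch: pull daily bars from a local AKTools service and cache them as JSON.
import json
import requests

URL = "http://127.0.0.1:8080/api/public/stock_zh_a_hist"           # assumed local AKTools address
PARAMS = {"symbol": "002594", "period": "daily", "adjust": "qfq"}  # BYD, daily, forward-adjusted (assumed params)

resp = requests.get(URL, params=PARAMS, timeout=30)
resp.raise_for_status()
rows = resp.json()                       # list of dicts, one per trading day

with open("stock_data.json", "w", encoding="utf-8") as f:
    json.dump(rows, f, ensure_ascii=False, indent=2)

print(f"saved {len(rows)} rows to stock_data.json")
```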

The result is shown below:

Step 2: AI Processing and Calculation – Let the AI learn the “TAV” analysis method

With the raw market data in hand, we come to the most critical step: using the TAV framework from the previous section to design a structured prompt that directs the AI’s analysis.

The core of this prompt is to directly translate the operation steps of the “TAV framework” into AI task instructions.

# Role: Expert in Python quantitative analysis script generation based on the TAV framework
You are a top Python quantitative engineer, proficient in the “TAV technical analysis framework” (trend, momentum, volume). Your task is to write a high-quality, ready-to-run Python script. The script’s input is a JSON string containing daily stock market data, and its output is a structured JSON object containing all calculated results.
## Script Development Requirements
**1. Input Data Format:**   
The script must accept a JSON string in the following format as input: a list of dictionaries, each representing one day’s market data.   
```json
[
  {"date": "1991-04-03T00:00:00.000", "ticker": "000001", "open": 49.0, "close": 49.0, "high": 49.0, "low": 49.0, "volume": 1, "turnover": 5000.0, "amplitude": 0.0, "change_percent": 22.5, "change_amount": 9.0, "turnover_rate": 0.0},
  ...more rows...
]
```
**2. Core Task: Implement the TAV Framework’s Calculation Logic in the Python Script**   
Your script must include the following calculation features; use standard libraries such as ‘pandas’ and ‘talib’ (if needed) to implement them:   
**Part 1: Trend Analysis:**   
– Calculate **MA5, MA10, MA20**.   
– **[Critical]** Write a function that judges the latest moving-average alignment (returning “bullish alignment”, “bearish alignment”, or “tangled/oscillating”).   
**Part 2: Momentum Analysis:**   
– Calculate **RSI (6-day, 12-day)**.   
– Calculate **MACD (DIF, DEA, MACD histogram)**.   
– **[Critical]** Write a function that judges the latest MACD status (returning “golden cross”, “death cross”, “top divergence”, “bottom divergence”, “high-level passivation”, etc.).   
**Part 3: Volume Analysis:**   
– Calculate the 5-day average trading volume.   
– **[Critical]** Write a function that judges the latest volume status (returning “significant volume expansion”, “moderate volume”, “volume contraction”, “extremely low volume”).   
**Part 4: Decision Synthesis:**   
– **[Critical]** Write functions that calculate the recommended buy and sell ranges strictly based on the combined signals of the TAV framework.     
– **buyRange calculation logic**: a strong buy signal should satisfy the following combination: on **trend**, the price pulls back to a key support such as MA20; on **momentum**, RSI is at a low level or MACD forms a golden cross; on **volume**, there are signs of stabilizing or expanding volume. Based on this, compute a reasonable potential buy price range (e.g., with the support level as the lower bound and the support level plus 3-5% as the upper bound).     
– **sellRange calculation logic**: a strong sell signal should satisfy the following combination: on **trend**, the price touches a key resistance such as a previous high; on **momentum**, RSI enters the overbought zone or shows a top divergence; on **volume**, there is a blow-off volume spike or a rise on shrinking volume. Based on this, compute a reasonable potential sell/take-profit price range (e.g., with the resistance level as the upper bound and the resistance level minus 3-5% as the lower bound).   
**Part 5: Generating the AI Analysis Text (aiAnalysis):**   
– **[Critical]** Write a function that stitches the previously calculated **TAV status text** into a structured analysis conclusion, for example combining the moving-average status, MACD status, and volume status into one paragraph.
**3. Script Output Format:**   
The script’s final output must be a JSON object printed to the console that conforms to the following structure and key names. All values refer to the most recent trading day only (usually the last row of the input data).   
```json
{
  "symbol": "extracted from the input data",
  "name": "(may be left blank or set equal to symbol)",
  "currentPrice": "latest close",
  "changePercent": "latest change percent",
  "sellRange": {"min": 0.0, "max": 0.0},
  "buyRange": {"min": 0.0, "max": 0.0},
  "aiAnalysis": {
    "trend": "TAV-Trend: [moving-average status generated by the script]",
    "momentum": "TAV-Momentum: [RSI/MACD status generated by the script]",
    "volume": "TAV-Volume: [volume status generated by the script]",
    "recommendation": "Synthesis: [short suggestion based on the combined TAV signals]",
    "risk": "Risk warning: [short risk warning based on the combined TAV signals]"
  },
  "techData": {
    "MA5": 0.0,
    "MA10": 0.0,
    "MA20": 0.0,
    "RSI6": 0.0,
    "RSI12": 0.0,
    "MACD": {"dif": 0.0, "dea": 0.0, "bar": 0.0, "status": "[MACD status generated by the script]"}
  }
}
```
**4. Script Quality Requirements:**   
– **Clear comments**: add necessary comments for key functions and calculation steps.   
– **Robust code**: handle edge cases where data may be insufficient (e.g., MACD cannot be calculated with fewer than 26 days of history).   
– **Dependency note**: clearly state the Python libraries to install in a comment at the top of the script (e.g., ‘# requirements: pandas, talib’).

Please start writing this Python script now.
“`

Pay attention to the design of this prompt; it is the essence of AI-assisted analysis. We did not ask the AI to “think” or “analyze” directly, because that easily produces hallucinations.

Instead, we position the AI as a top-tier “code generator”. Using the TAV framework as a blueprint, we direct the AI to write a complete, executable set of Python calculation scripts for us.

The benefits are:

  1. The results are verifiable: the code’s logic is explicit, and for the same input the output is always the same. This not only eliminates AI hallucinations but also makes it easy to build the Dify workflow in the next stage.
  2. The analysis framework is built in: instead of simply listing requirements, the prompt embeds the TAV analysis logic, which underpins the AI’s analysis conclusions in the next step.

In short, we let the AI do what it does best, translating natural-language rules into code, and hand the most critical part, computation and logical execution, over to deterministic Python scripts. The result is genuinely “reliable”.

Step 3: Run the script & generate the report – dynamically update the HTML

The whole process can be broken down into two small actions:

Action 1: Run the Python script to generate the analysis result JSON

  1. Prepare the environment: make sure Python and the libraries required by the script (e.g., pandas, talib) are installed on your computer.
  2. Get the data: fetch the latest market data through the API we built in Step 1 and save it as a JSON file (e.g. stock_data.json).
  3. Execute the script: run the AI-generated Python script. It reads the stock_data.json file, performs all the TAV calculations, and prints the JSON object containing all analysis results to the screen.
  4. Save the result: copy the printed result and save it as a new JSON file (e.g. analysis_result.json); a minimal glue sketch follows this list.
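Here is the minimal glue sketch referenced above. The module name analyze_tav and its analyze() function are hypothetical placeholders for whatever your AI-generated script actually exposes.

```python
# Minimal sketch: glue the pieces of Action 1 together.
import json

from analyze_tav import analyze   # hypothetical: your AI-generated TAV script from Step 2

with open("stock_data.json", encoding="utf-8") as f:
    records = json.load(f)        # raw daily bars from Step 1

result = analyze(records)         # runs the TAV calculations, returns a dict matching the output schema

with open("analysis_result.json", "w", encoding="utf-8") as f:
    json.dump(result, f, ensure_ascii=False, indent=2)
```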

Cookie Brother’s tip: we can string the actions above together with a Dify workflow to achieve one-click automation. That way, after each close, a single tap runs the entire process from “get data” to “generate analysis results”.

Action 2: Use the HTML template to “dynamically” display the JSON results

With the analysis_result.json file containing the analysis results, the final step is to turn it into a clean, intuitive HTML report like the one shown at the beginning.

The biggest pitfall of letting the AI generate the HTML report directly is instability: today’s report may be perfect, but after tomorrow’s data update the layout may fall apart.

Therefore, we adopt a “template + data separation” strategy, which is the key to stable output.

1. Have the AI generate the HTML template first: let the AI act as an “HTML page generation expert for personal stock investment decision dashboards”, using a Bento Grid layout, GSAP animations and similar techniques to design an attractive page template that contains no specific data.

Prompt:

# Role: Personal stock investment decision dashboard HTML page generation expert
## Introduction:
– Description: A professional stock investment analyst and dynamic HTML web design expert who specializes in creating personal investment decision-support dashboard pages that meet modern design trends and technical requirements.
## Background: You are a senior stock investment consultant and web design expert who specializes in transforming market data, technical indicators, and AI analysis conclusions for personal holdings into visually appealing dynamic HTML pages. You are familiar with modern web technologies and design trends, especially Bento Grid layouts and GSAP animations.
## Objective: Generate a complete, ready-to-use HTML page that presents an investment decision-support dashboard for individual holdings. The page should meet all technical and design requirements, clearly and intuitively display stock quotes, technical indicators, and AI analysis conclusions, and help users make scientific buy and sell decisions.
## Technical Requirements:
1. Use a Bento Grid layout system
2. Integrate GSAP animations and Framer Motion
3. Build on HTML5 and TailwindCSS
4. Responsive design with large/small font-size contrast, adapting to both mobile and desktop
## Design Specifications:
1. Choose an appropriate background color and theme tone based on the stock’s characteristics
2. Use oversized fonts and visual elements to highlight key points and create visual contrast
3. Mix Chinese and English: Chinese in large type as the primary text, English in small type as accents
4. Use clean rectangular elements for data visualization
5. Apply transparency to highlight-colored borders so that different highlight colors do not cover each other
6. Style all data charts consistently, with footnote-style annotations, to maintain thematic consistency
7. Avoid using emojis as primary icons
## Output Format: Provide the complete HTML code directly, including all necessary CSS and JavaScript, ensuring the code can be copied and used as-is and works correctly.
The code should contain:
1. The complete HTML structure
2. Inline or externally referenced CSS (including TailwindCSS)
3. The necessary JavaScript (including GSAP and Framer Motion)
4. CDN references and other required resource links
## Data Input Instructions: You will receive the following three data variables; reference and visualize them appropriately in the page:
– **recent_data**: raw market data for the past 30 days (a JSON string including date, open, close, high, low, volume, turnover, change and other fields)
– **tech_data**: technical indicators calculated from recent_data (a JSON string including MA5, MA10, MA20, RSI6, RSI12, MACD and other common indicators)
– **text**: the stock analysis conclusion generated by the AI large model (a string including trend judgment, buy/sell suggestions, risk warnings, etc.)
## Initialization:
As an expert in generating HTML pages for personal stock investment decision dashboards, I am ready to create a complete HTML page for you. Please use the data variables above directly to generate usable HTML code around the page’s core goal:
– Core goal: grasp the key status of all held stocks within 5 seconds. Key elements to include: an overview of each position’s value, the sell range [], the buy range [], and the technical indicators behind the judgment.
– Other requirements: support displaying multiple held stocks (each stock shown only once), highlight the AI analysis conclusions, and deliver a modern, interactive, friendly dashboard page that can be embedded directly in a web page.

2. Reserve a data interface in the template: in the HTML template, write a small piece of JavaScript whose job is to read an external JSON file and fill its data into the designated positions on the page.

Reference prompt:

To make the personal stock investment dashboard render from a more stable template (same structure, same indicators, etc.), please modify the previous HTML report template so that the data indicators in the HTML are read directly from a data file; that way, the next time I update the data file, the HTML can read the latest data. Please give me the HTML template. The data file is the example stock-output JSON [please add attachment].
Requirements:
1. I will open this HTML directly in a modern browser such as Chrome, so give me the simplest way to make sure the data file is read correctly
2. I do not have a server; it is a pure HTML file
3. Operation: let the HTML read the JSON data via a file upload, which means a file-upload feature needs to be added to the HTML
4. Keep the JSON data file simple: it should store only the metrics, their values, and the related text, with all the complex parts written into the HTML template

3. One-click upload, report generated: we only need the HTML template to support local file upload. After each close, upload the freshly generated analysis_result.json to the page, and it will automatically refresh to show the latest decision dashboard.

In the end, you get an “after-hours AI strategist” that you can rely on for the long term.

Going further (which we will do in the next stage), we can use Dify’s “Template Conversion” node to update the HTML report automatically inside the workflow.

Reference:

The Ultimate Guide to Dify’s “Template Conversion” Node: Advanced Techniques for Dynamic Text Generation (with Code)

Jinja2 engine analysis | Summary of 6 major application scenarios

4. The risks of AI in financial analysis and the right approach

Okay, that’s the end of the analysis. But is this AI workflow flawless? Of course not.

The two biggest problems with AI in financial analysis are “hallucination” and the “black box”.

And our workflow is designed precisely to avoid these two pitfalls.

1. How to avoid the “computational hallucination”?

Wrong approach: ask the AI directly, “Help me analyze this stock and tell me whether to buy or sell tomorrow.” The AI may well spout nonsense with a perfectly straight face.

Right approach: our strategy is to have the AI generate executable Python code rather than provide analysis results directly.

In the prompt, we use the TAV framework to pin down the analysis logic and let the AI translate that logic into code. We trust the determinism of the code, not the “creativity” of the AI.

2. How to open the “decision black box”?

Wrong approach: look only at the AI’s final recommendation (“buy” or “sell”) without caring about the reasons.

Right approach: besides the buy/sell range conclusions, our report must clearly show “aiAnalysis” (the judgment for each TAV dimension) and “techData” (the values of core indicators such as MA, RSI, and MACD).

That way, we can quickly verify whether the AI’s suggestions match the TAV logic we set, and understand not only the conclusion but also the reasoning behind it.

In an era where AI is readily available, simply asking AI questions is no longer enough. Being able to design and build an automated system that keeps solving our real pain points and turning them into efficient decision-making power may be the real “superpower” each of us ordinary people can master.

End of text