
I have this code: I need a table with the stock Volume always updated. The script doesn't work, the page keeps saying "loading". How can I improve it? I want to create a page with different live stock data.

import dash
import dash_core_components as dcc
import dash_html_components as html
from dash.dependencies import Input, Output
import dash_bootstrap_components as dbc
import pandas as pd
import yahoo_fin.stock_info as si
import requests

external_stylesheets = ['https://codepen.io/chriddyp/pen/bWLwgP.css']

app = dash.Dash(__name__, external_stylesheets=external_stylesheets)
app.layout = html.Div(
    html.Div([
        html.H4('Volume Feed'),
        html.Div(id='live-update-text'),
        #dcc.Graph(id='live-update-graph'),
        dcc.Interval(
            id='interval-component',
            interval=1*100, # in milliseconds
            n_intervals=3
        )
    ])
)


@app.callback(Output('live-update-text', 'children'),
              Input('interval-component', 'n_intervals'))
def update_metrics(n):
    QuoteTable = si.get_quote_table("aapl")
    Volume=QuoteTable["Volume"]
    row1 = html.Tr([html.Td("Volume", style={'font-weight': 'bold'}), html.Td(Volume)])
    table_body = [html.Tbody([row1])]

    style = {'padding': '5px', 'fontSize': '16px'}
    return [
        dbc.Table(table_body,bordered=True,hover=True,responsive=True,striped=True,)
    ]



if __name__ == '__main__':
    app.run_server()

Any suggestions?

  • You could try breaking the code into sections, and running each, to see where the bottleneck is. – Nov 25, 2020
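Following that suggestion, you could time the Yahoo request on its own to see how long the fetch actually takes compared to the 100 ms interval (a minimal sketch, assuming yahoo_fin is installed and the Yahoo endpoint is reachable):

import time
import yahoo_fin.stock_info as si

# Time the same call the Dash callback makes on every interval tick.
start = time.perf_counter()
quote = si.get_quote_table("aapl")
elapsed = time.perf_counter() - start

print(f"Fetch took {elapsed:.2f}s, Volume = {quote['Volume']}")

If the fetch alone takes longer than the interval, the callback can never finish before it is triggered again.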

1 Answer


Your interval is 100 ms, so the callback fires ten times per second. The request to Yahoo takes longer than that, so each update is triggered before the previous one can render and you never get a chance to see anything. Change the interval to something like 2000 ms or more and it should work. You might still run into problems if the page had more components, however.

interval=2*1000
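
In context, the interval component would look like this (a sketch of your component with only the timing changed; n_intervals=0 just starts the counter at zero):

dcc.Interval(
    id='interval-component',
    interval=2 * 1000,  # fire the callback every 2 seconds instead of every 100 ms
    n_intervals=0       # start counting intervals from zero
)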
