
I have a feature I'm working on that takes an object (over 10k items), picks an item from it, sends it to an API which processes it and returns a response, then proceeds to the next item.

Currently I'm using the async library's mapLimit method, and it works as expected.

But my problem is that it takes too long to loop through the entire dataset because of its size.

This feature is supposed to be a continual process: once the entire object has been iterated, wait a few seconds, then do the same thing again.
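The loop pattern I mean is roughly this (a minimal sketch; `runLoop`, `processBatch`, and the pause length are made-up names for illustration, not my real code):

```javascript
// Sketch of the "process everything, pause, repeat" loop.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function runLoop(processBatch, pauseMs, maxPasses = Infinity) {
  for (let pass = 0; pass < maxPasses; pass++) {
    await processBatch();                           // one full pass over the dataset
    if (pass + 1 < maxPasses) await sleep(pauseMs); // brief pause between passes
  }
}
```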

I tried forking a child_process for this: I broke the object into chunks and created a process for each chunk until it was completed. This worked as intended, but the memory consumption was massive; other processes on the server failed as a result of the lack of available memory, even after I exited each process once it finished.
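For reference, the chunking I did before forking was essentially this (a hypothetical `chunk` helper; the real per-chunk work ran inside the forked processes):

```javascript
// Split an array into fixed-size chunks, one chunk per forked process.
function chunk(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}
```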

How can I achieve this at a faster rate?
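In case it helps to see it without the library, the concurrency-limited mapping that `async.mapLimit` does for me can be sketched like this (`mapWithConcurrency` is a made-up name; raising `limit` is one of the knobs I'm experimenting with):

```javascript
// Run `task` over `items` with at most `limit` tasks in flight at once.
// Each "lane" keeps pulling the next unclaimed index until the list is drained.
async function mapWithConcurrency(items, limit, task) {
  const results = new Array(items.length);
  let next = 0;

  async function lane() {
    while (next < items.length) {
      const i = next++;            // claim an index (safe: single-threaded JS)
      results[i] = await task(items[i], i);
    }
  }

  const lanes = Math.min(limit, items.length);
  await Promise.all(Array.from({ length: lanes }, lane));
  return results;
}
```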

I use this to get the list of wallets.

getListofWallet = async () => {
        try {
            const USDT = await usdt.query(sql`
                    SELECT * FROM USDT ORDER BY id DESC;
                `);

            // Process up to 6 wallets at a time; await the whole batch
            // so we know when the pass has actually finished.
            await async.mapLimit(USDT, 6, async (user) => {
                const userDetail = {
                    email: user.email,
                    id: user.user_id,
                    address: user.address
                };

                try {
                    await this.getWalletTransaction(userDetail);
                } catch (tronGridException) {
                    console.log(":: A TronGrid Exception Occurred");
                    console.log(tronGridException);
                }
            });

            console.log('~~~~~~~ Finished Wallet Batch ~~~~~~~~~');

            // Pause for a minute, then run the whole pass again.
            setTimeout(() => {
                this.getListofWallet();
            }, 60000);
        } catch (error) {
            console.log(error);
            console.log('~~~~~~~ Restarting TronWallet File after Crash ~~~~~~~~~');
            this.getListofWallet();
        }
    }

Then I use this to process the data returned and perform the necessary action.

getWalletTransaction = async (walletDetail) => {

        const config = {
            headers: {
                'TRON-PRO-API-KEY': process.env.api_key,
                'Content-Type': 'application/json'
            }
        };

        const getTransactionFromAddress = await axios.get(`https://api.trongrid.io/v1/accounts/${walletDetail.address}/transactions/trc20`, config);

        const response = getTransactionFromAddress.data;

        const currentTimeInMillisecond = 1642668127000; //1632409548000

        // for...of instead of forEach so each await actually completes
        // before the function returns.
        for (const value of response.data) {

            if (value.block_timestamp >= currentTimeInMillisecond && value.token_info.address == "TR7NHqjeKQxGTCi8q8ZY4pL8otSzgjLj6t") {

                const doesHashExist = await transactionCollection.query(sql`SELECT * FROM transaction_collection WHERE txhash=${value.transaction_id};`);

                if (doesHashExist.length == 0) {

                    if (walletDetail.address == value.to) {

                        const checkExistence = await CryptoTransactions2.query(sql`
                             SELECT * FROM CryptoTransaction2 WHERE txHash=${value.transaction_id};
                        `);

                        if (checkExistence.length == 0) {

                            const xCollection = {
                                collection: "CryptoTransaction2",
                                queryObject: {
                                    currency: "USDT",
                                    sender: ObjectID("60358d21ec2b4b33e2fcd62e"),
                                    receiver: ObjectID(walletDetail.id),
                                    amount: parseFloat(tronWeb.fromSun(value.value)),
                                    txHash: value.transaction_id,
                                    description: "New wallet Deposit " + "60358d21ec2b4b33e2fcd62e" + " into " + value.to,
                                    category: "deposit",
                                    createdAt: new Date(),
                                    updatedAt: new Date(),
                                },
                            };

                            await new MongoDbService().createTransaction(xCollection);

                            // create a record inside the CryptoTransaction2 table.
                            await CryptoTransactions2.query(sql`INSERT INTO CryptoTransaction2 (txHash) VALUES (${value.transaction_id})`);
                        }
                    }
                }
            }
        }
    }
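One thing I'm considering for speed: the per-transaction existence checks above cost a database round trip each, and they could be batched into a single query per wallet. A rough sketch (the `query` parameter and the placeholder style are assumptions about the driver; `findExistingHashes` and `hashesToCheck` are hypothetical helper names, not code I have):

```javascript
// Collect all tx hashes from one wallet's transaction list.
function hashesToCheck(transactions) {
  return transactions.map((t) => t.transaction_id);
}

// Look up every hash in one SELECT ... IN (...) round trip and return
// the set that already exists, so only new hashes get inserted.
async function findExistingHashes(query, hashes) {
  if (hashes.length === 0) return new Set();
  // Placeholder expansion varies by driver; this assumes a client that
  // accepts an array parameter for the IN clause.
  const rows = await query(
    'SELECT txHash FROM CryptoTransaction2 WHERE txHash IN (?)',
    [hashes]
  );
  return new Set(rows.map((r) => r.txHash));
}
```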
  • It's unlikely anyone can provide much useful help without seeing the relevant portions of your actual code to be able to see the details of what you've implemented. Questions about code must include the relevant code in the question. Otherwise, all we can do is make wild guesses which is rarely the best way to do things and is always less efficient for all. Commented Oct 7, 2022 at 23:42
  • My apologies, I have updated the question with the necessary code base. Commented Oct 8, 2022 at 0:34
  • What's the point of a 60 second setTimeout() in your code? Have you profiled or measured in any way where most of the long duration of time is being spent? Commented Oct 8, 2022 at 2:41
  • The setTimeout is to hold on for a moment before the whole process is triggered again. Yes I did; I believe the time it takes to go through roughly 10k records is what makes it slow. If there's a way I can complete the entire iteration in 5 minutes or less, that will solve my problem, because this operation is time sensitive. Commented Oct 8, 2022 at 4:50
  • I doubt anyone here can help you with performance suggestions without having either detailed profiling data on how much of the time each step is taking and/or having something we can run and benchmark ourselves. Commented Oct 8, 2022 at 17:57
