You can accomplish this task using an Azure Data Factory (ADF) Data Flow.
First, you will have to install a self-hosted integration runtime (IR) on your local machine.
Then create a new linked service that uses this self-hosted IR, point it at your local SQL Server instance and database, and test the connection.
Follow: https://www.mssqltips.com/sqlservertip/5812/connect-to-onpremises-data-in-azure-data-factory-with-the-selfhosted-integration-runtime--part-1/
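If you prefer to check your setup against the JSON (Code) view in ADF, a minimal sketch of such a linked service could look like the following. The names OnPremSqlServerLS and MySelfHostedIR, the connection string, and the credentials are placeholders you would replace with your own:

```json
{
    "name": "OnPremSqlServerLS",
    "properties": {
        "type": "SqlServer",
        "connectVia": {
            "referenceName": "MySelfHostedIR",
            "type": "IntegrationRuntimeReference"
        },
        "typeProperties": {
            "connectionString": "Data Source=localhost;Initial Catalog=SourceDB;Integrated Security=False;User ID=sqluser;Password=<your-password>"
        }
    }
}
```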
Similarly, you need to create a linked service for the Azure SQL DB using the default Azure integration runtime available in ADF. Follow: https://www.tallan.com/blog/2021/03/30/how-to-create-a-sql-database-linked-service-in-azure-data-factory/
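Again as a rough sketch of the Code view, with AzureSqlDbLS, the server, database, and credentials as placeholders:

```json
{
    "name": "AzureSqlDbLS",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": "Server=tcp:<your-server>.database.windows.net,1433;Database=TargetDB;User ID=azureuser;Password=<your-password>;Encrypt=True;"
        }
    }
}
```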
Once you have created both linked services, you need to create datasets on top of them. One dataset will refer to the source table on the local machine and the other will be used as the sink in the Azure SQL database.
To create a dataset in ADF, go to Author (pencil icon on the left side) -> Datasets -> New dataset.
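For example, the source dataset for a hypothetical dbo.Person table could look roughly like this in the Code view (the sink dataset is analogous, with type AzureSqlTable and the Azure SQL linked service instead):

```json
{
    "name": "OnPremPersonDS",
    "properties": {
        "type": "SqlServerTable",
        "linkedServiceName": {
            "referenceName": "OnPremSqlServerLS",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "schema": "dbo",
            "table": "Person"
        }
    }
}
```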
Now comes the main part: the ADF Data Flow.
To create a data flow, go to Author (pencil icon on the left side) -> Data flows -> New data flow.
Build the flow as Source -> DerivedColumn -> AlterRow -> Sink. In the source settings, select the input dataset you created in the steps above, where your input data is stored.

You can enable the Data flow debug toggle to check the data preview after each transformation step. We will transform the FirstName column as per the requirement.

In the DerivedColumn transformation, we replace spaces in FirstName with an empty string using the expression replace(FirstName, ' ', '').

In the AlterRow transformation, we delete rows based on a condition: add a Delete if rule with the expression PersonID > 8. (Note that for deletes to actually be applied, the sink must have Allow delete enabled and a key column selected.)

Once done, you can check the resulting data in the sink's Data preview tab.
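Putting the pieces together, the whole data flow in the Code view would look roughly like the sketch below. PersonDataFlow, OnPremPersonDS, and AzureSqlPersonDS are the hypothetical names used in this answer, and the scriptLines are approximately what the canvas generates for the transformations described above:

```json
{
    "name": "PersonDataFlow",
    "properties": {
        "type": "MappingDataFlow",
        "typeProperties": {
            "sources": [
                { "name": "source1", "dataset": { "referenceName": "OnPremPersonDS", "type": "DatasetReference" } }
            ],
            "sinks": [
                { "name": "sink1", "dataset": { "referenceName": "AzureSqlPersonDS", "type": "DatasetReference" } }
            ],
            "transformations": [
                { "name": "DerivedColumn1" },
                { "name": "AlterRow1" }
            ],
            "scriptLines": [
                "source(allowSchemaDrift: true, validateSchema: false) ~> source1",
                "source1 derive(FirstName = replace(FirstName, ' ', '')) ~> DerivedColumn1",
                "DerivedColumn1 alterRow(deleteIf(PersonID > 8)) ~> AlterRow1",
                "AlterRow1 sink(allowSchemaDrift: true, validateSchema: false, deletable: true, insertable: true, keys: ['PersonID']) ~> sink1"
            ]
        }
    }
}
```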

Finally, create a new pipeline and drag a Data flow activity onto the canvas. In the activity's Settings tab, select the data flow we created above. Then click Add trigger -> Trigger now.
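The resulting pipeline, again sketched in the Code view with the placeholder names used above:

```json
{
    "name": "RunPersonDataFlow",
    "properties": {
        "activities": [
            {
                "name": "TransformPerson",
                "type": "ExecuteDataFlow",
                "typeProperties": {
                    "dataFlow": {
                        "referenceName": "PersonDataFlow",
                        "type": "DataFlowReference"
                    }
                }
            }
        ]
    }
}
```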

Once the pipeline run finishes, you can check the transformed data in your Azure SQL DB.