Local Tutorial 3: Debugging using Spark logs

If you run into issues with your job, you will want to dig into the job logs to determine the cause. Spark jobs produce detailed logs; you can access them via the docker logs command.

In this tutorial, you'll learn to:

  • View docker logs

Docker logs

  • You can view the logs by querying the running container via PowerShell (see the note after this list if your container name differs)
    • Launch PowerShell, then run the following command:
      docker logs --tail 1000 dataxlocal
      
    • If you want the logs to be updated continuously (follow mode), use the '-f' flag:
      docker logs -f --tail 1000 dataxlocal
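
The commands above assume the job container is named dataxlocal. If the container was started under a different name on your machine, you can list the running containers first and substitute the name you see:

  # List the running containers, showing only their names.
  docker ps --format "{{.Names}}"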
      

This will help you diagnose issues, inspect exceptions and call stacks, and confirm that jobs are running properly.
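
When you are hunting for a specific failure, it is often faster to filter the log stream than to read it end to end. Here is a minimal PowerShell sketch, assuming the dataxlocal container from above; the output file name spark-errors.log is illustrative:

  # docker logs writes part of its output to stderr, so 2>&1 merges the
  # streams before filtering. Matching lines are echoed to the console
  # and also saved to spark-errors.log for offline inspection.
  docker logs --tail 1000 dataxlocal 2>&1 |
      Select-String -Pattern "ERROR", "Exception" |
      Tee-Object -FilePath spark-errors.log

ERROR and Exception are the usual markers in Spark's log4j output, but you can pass any pattern that Select-String accepts.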

Other Links