Google Summer of Code 2018 Work Product Submission



coala

Vaibhav Kumar Rai

I am a third-year student of B.Tech Computer Science Engineering at Shri Mata Vaishno Devi University, Katra, Jammu and Kashmir. I participated in GSoC and worked with coala on making it easier to write better-quality Bears: I created a tool that can jump directly into Bear code and move through it using a debugger interface such as pdb, making debugging a Bear as easy as writing one. The debugger can also investigate the settings passed to the Bear. I also implemented a Profiler Interface, which provides the ability to profile Bear code in order to optimize its performance.


Patches Tarball


SHA-256:

d61862f8f8be5ec168b07741c09781ef2dcc991e44d87edc05e6d237044052b0

Bonding

Phase 1

Phase 2

Phase 3


Links to commits and repositories I've worked on:

cEPs (View)
cEP-0021.md: Proposed implementation of the Profiling Interface.

coala (View)
Removed the filter_raising_callables function and inlined it, to reduce the complexity of the code ahead of future modifications in the Debugger Interface pull request.

coala (View)
Added the Debugger Interface with the --debug-bears argument, and made debugged bears run in a single-process environment so that pdb works and bear developers can debug the specified bears.

coala (View)
Debug_Bears.rst: Added the Debugger Interface documentation with a demo.

coala (View)
Prevented a bear test from failing by replacing httpstat with Google's teapot endpoint because of an SSL certification error.

coala-bears (View)
Debugged PySafetyBear to find the cause of an upstream problem and of the AppVeyor CI failure; updated PySafetyBear to support the latest version of safety.

coala (View)
Added the settings inspection toolset, through which users can access the settings of a Bear in the debugging environment.

coala (View)
Debug_bears.rst: Added the settings inspection toolset documentation with a demo.

coala (View)
Removed the debugger flag from the __init__ of the bear Base class because it broke the API for existing users.

coala-utils (View)
Extended the TRUE_STRINGS and FALSE_STRINGS constants in coala-utils so that settings like debugger and profiler can be enabled and disabled with values such as ok, positive, none, negative, etc.

coala (View)
Added the Profiler Interface to coala with the --profile argument to profile the execution of bears.

mobans (View)
Added prof/ to the coala .gitignore template, so that the profiled-data directory created by pytest-profiling is ignored.

coala (View)
Profile_Bears.rst: Added the Profiling Interface documentation with a demo.

pytest-plugins (View)
pytest_profiling.py: Added an additional argument to make the profile output directory configurable. (Not merged; approved by GSoC admin)


Debug and Profile Bears

Work Done

  1. Debugger Interface added in coala.
    1. Steps into the run() method of a bear, moves through it using a debugger interface such as pdb, and exits as soon as run() is left.
    2. Pdb’s quit command (q) is remapped so that coala continues its normal execution without aborting: quit or q first clears all breakpoints and then continues execution.
    3. Users can specify the bears they want to debug using --debug-bears.
    4. Users can also enable bear debugging through a .coafile.
  2. A new command, settings, is included in coala’s debugging interface to inspect Bear settings in the debugging environment. It displays all settings passed to a Bear along with their values, so bear developers can inspect them quickly (see the session sketch after this list).

  3. Profiler Interface added in coala.
    1. The profiler profiles the run() method of bears, because this is the part bear writers spend their time on; the rest, such as loading the files and collecting the settings, is done by coala itself.
    2. --profile-bears on the command line, or profile_bears in a .coafile, is the main argument to enable profiling.
    3. Accepts an additional parameter, directory_path, through which bear developers can specify where to store the profiled data files (see the usage sketch after this list).
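
For illustration, a debugging session could look roughly like the following transcript. The bear, file paths, line numbers and setting values are placeholders, and the actual output of the interface may differ:

    $ coala --files=sample.py --bears=SpaceConsistencyBear --debug-bears SpaceConsistencyBear
    > .../bears/general/SpaceConsistencyBear.py(35)run()
    -> for line_number, line in enumerate(file):
    (Pdb) settings
    use_spaces = True
    allow_trailing_whitespace = False
    (Pdb) next
    > .../bears/general/SpaceConsistencyBear.py(36)run()
    (Pdb) q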

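To examine the profiled data, one option is the standard pstats module, assuming the data files are ordinary cProfile dumps. A minimal sketch, where profiling was enabled with profile_bears = True in the .coafile and a file like prof/SpaceConsistencyBear.prof was produced (both names are illustrative):

    import pstats

    # Load one generated profile file and show the ten most expensive
    # calls sorted by cumulative time.
    stats = pstats.Stats('prof/SpaceConsistencyBear.prof')
    stats.sort_stats('cumulative').print_stats(10)
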
Challenges

I had difficulties with testing the Debugger Interface. With the help of Mock, a test for the debugger interface was created; although it brought code coverage to 100%, a mock test was not the ideal test. After hours of discussion with Mischa, we decided to capture pdb's stdout and assert it against the real output. Then a problem arose: the ideal tests passed, but coverage somehow no longer reported 100%, and the coverage of the previous tests disappeared as well. At first I thought the new test had a problem, but it looked like a pytest-cov issue, so to confirm this I created a test repository, enabled CI on it, and checked the coverage report; it showed the same failure. I then opened an issue on the pytest-cov repository.

The actual problem was that coverage and the debugger both rely on the same trace feature, so they cannot run in parallel; in other words, trace functions cannot be chained. Mischa suggested restoring the trace callbacks before running the test, which finally worked, so I included both the ideal and the Mock tests to keep coverage at 100%. All of this put me around three weeks behind schedule, so I could not implement one of the proposed features.
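
The fix can be sketched as follows. This is a simplified illustration of the idea, not coala's actual test code, and the helper name is made up:

    import sys


    def run_under_debugger(test_body):
        """Run test_body (which starts pdb) without losing coverage.

        coverage.py measures executed lines through the trace callback it
        installs via sys.settrace(); pdb installs its own callback and
        thereby silently disables coverage for everything run afterwards.
        """
        saved_trace = sys.gettrace()   # coverage's trace callback, if any
        try:
            test_body()                # pdb replaces the trace function here
        finally:
            sys.settrace(saved_trace)  # restore it so coverage keeps working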

Work to be done

A tool which will provide the ability to inspect result instances (their origin, diffs, file, severity, message, aspects and several other attributes) in the debugging environment, or in a separate Python console where users can easily access their values. Information about results will help bear developers create better-performing Bears.

The result inspection tool will not only help in debugging bear results, but will also help in testing bears by asserting the various attributes of the expected result against those of the actual result.
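
Assuming result objects expose the attributes listed above (origin, message, severity, and so on), such a test assertion could be sketched like this; the helper and its exact parameters are hypothetical:

    def assert_result_matches(actual, **expected_attributes):
        # Compare only the attributes the test cares about, e.g.
        # origin, message or severity, instead of whole result objects.
        for name, expected in expected_attributes.items():
            actual_value = getattr(actual, name)
            assert actual_value == expected, (
                f'{name}: expected {expected!r}, got {actual_value!r}')

A test could then call assert_result_matches(result, origin='MyBear', message='Line is too long.') to check just the fields it is interested in.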