Behave - Debugging
Behave scripts can be debugged by dry running the test steps. A dry run walks through all the test steps without actually executing them, which helps to identify steps that have no matching definition in the step implementation file.
It also verifies whether there are any missing import statements, syntax errors, and so on. All these issues are detected quickly by a dry run. If we are doing mass updates or configuration changes, a dry run helps to catch such errors in a short time.
Running the entire suite just for debugging would be time consuming. In Behave, we can debug with a dry run using the command given below −
behave --no-capture --dry-run
You will get the screen as shown below −
(myenv) D:\behave\myenv\pythonProject>behave --no-capture --dry-run
USING RUNNER: behave.runner:Runner
Feature: Payment Process # features/payment.feature:1
Scenario: Verify transactions # features/payment.feature:2
Given user makes a payment of 100 INR # features/steps/stepImpPayment.py:9
Feature: Administration Process # features/payment1.feature:1
Scenario: Verify admin transactions # features/payment1.feature:2
Given user is on admin screen # features/steps/stepImpPayment.py:13
0 features passed, 0 failed, 0 skipped, 2 untested
0 scenarios passed, 0 failed, 0 skipped, 2 untested
0 steps passed, 0 failed, 0 skipped, 2 untested
Took 0min 0.000s
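The feature file and the step implementation file behind this output would look roughly as shown below. The exact contents are only an assumption, reconstructed from the file names and step texts that appear in the dry run output −
features/payment.feature
Feature: Payment Process
   Scenario: Verify transactions
      Given user makes a payment of 100 INR
features/steps/stepImpPayment.py
from behave import given

@given('user makes a payment of 100 INR')
def step_make_payment(context):
    # payment logic would go here; not executed during a dry run
    pass

@given('user is on admin screen')
def step_admin_screen(context):
    # admin screen navigation logic would go here
    pass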
The output shows the count 2 untested against features, scenarios and steps, since the dry run does not actually execute anything.
If any step in a feature file had no matching definition in the step implementation file, the dry run output would report it as undefined, making such steps easy to spot.
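To see how the dry run flags a missing definition, suppose a new step is added to payment.feature that has no matching decorator in stepImpPayment.py (the step text here is only an illustration) −
Feature: Payment Process
   Scenario: Verify transactions
      Given user makes a payment of 100 INR
      Then payment should be reflected in the account
On the next dry run, Behave would count this step as undefined in the summary and typically suggest a snippet for it. Adding a matching definition such as the following (the function name is just an illustration) would clear the undefined status −
from behave import then

@then('payment should be reflected in the account')
def step_payment_reflected(context):
    # verification logic would go here
    pass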