Development Process at NODA

At NODA, our roots in Software Engineering and Computer Science shape our approach to software development. As a company committed to excellence, we ensure that our development processes reflect our dedication to quality, collaboration, and innovation.

Collaboration and Transparency

We believe that transparency and collaboration are key to successful project delivery. Our development process is designed to be open and inclusive, ensuring that all stakeholders are kept informed and involved. By employing an iterative, community-like process, we maintain the flexibility and adaptability needed to keep our solutions relevant in an evolving landscape.

Embracing Free and Open Source Software

As a small company, we deeply appreciate the contributions of the Free Software and Open Source communities. While it is not always possible for us to contribute back directly, we encourage our employees to participate in and contribute to open source projects whenever possible. This not only helps the community but also enhances our team's skills and knowledge.

Development Process


Planning

Planning is the first step of any development process. Given that most projects are driven by a delivery date, we start from that date and work backwards to build a comprehensive plan. This involves establishing:

  • Maintenance Period: Determining the duration for which the software will be maintained post-deployment.

  • Project Deadline: Setting the final delivery date for the project.

  • Testing Period: Allocating time for comprehensive testing to ensure the solution meets all requirements and quality standards.

  • Development Period: Scheduling the necessary time for the actual development work, including coding and initial testing.

  • Requirement Analysis Period: Defining the time needed for gathering, analyzing, and finalizing project requirements.

  • Resource Allocation: Identifying and assigning the necessary resources for both the development and maintenance phases.
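Working backwards from the delivery date can be sketched as a small calculation. The phase names and durations below are illustrative assumptions, not NODA defaults:

```python
from datetime import date, timedelta

def plan_backwards(deadline: date, durations: dict[str, int]) -> dict[str, date]:
    """Compute the latest start date of each phase by walking backwards
    from the project deadline. `durations` maps phase name to days,
    ordered from the phase closest to the deadline to the earliest one."""
    schedule = {}
    cursor = deadline
    for phase, days in durations.items():
        cursor -= timedelta(days=days)
        schedule[phase] = cursor  # this phase must start no later than here
    return schedule

# Illustrative durations (in days); real projects set their own.
phases = {"testing": 14, "development": 60, "requirement_analysis": 10}
schedule = plan_backwards(date(2025, 6, 30), phases)
```

Listing the phases nearest the deadline first mirrors the backwards planning described above: each entry pushes the cursor further from the delivery date.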

For projects or solutions with ongoing development and evolving requirements, each addition of new features should be treated as its own project, small or large depending on its scope.

While this process may resemble the traditional Waterfall methodology, and in many respects it functions similarly, it is designed to ensure thoroughness before progressing to the next stage. This approach minimizes the risk of investing time and resources in work that may later prove unnecessary. However, unlike the rigid structure of Waterfall, our process is flexible and encourages back-stepping when necessary. This adaptability allows us to revisit and revise previous stages to ensure we are always working on the right tasks and meeting the project’s evolving needs.

Call it what you will; we simply don't put a label on it.

Requirement Analysis

At the outset, we focus on thoroughly understanding the needs and defining the project requirements. This involves:

  • Conducting requirement elicitation meetings with key personnel and stakeholders.

  • Performing research to determine the feasibility of the requirements.

  • Developing a project-specific Requirement Specification.

  • Engaging in negotiations to determine which requirements can be addressed and which cannot, often due to time constraints and resource availability.

The initial phase of requirement analysis concludes once all relevant parties have agreed on the specifications and the interpretation of each requirement.
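One lightweight way to make the agreed specification checkable is to record each requirement together with its negotiated status. The fields and statuses below are illustrative, not a NODA schema:

```python
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    PROPOSED = "proposed"
    ACCEPTED = "accepted"
    REJECTED = "rejected"  # e.g. dropped during negotiation

@dataclass
class Requirement:
    ident: str
    description: str
    status: Status = Status.PROPOSED

def analysis_complete(spec: list[Requirement]) -> bool:
    """The phase concludes only once every requirement has been
    explicitly accepted or rejected by all parties."""
    return all(r.status is not Status.PROPOSED for r in spec)

spec = [
    Requirement("REQ-1", "Export reports as PDF", Status.ACCEPTED),
    Requirement("REQ-2", "Offline mode", Status.REJECTED),
]
```

A single remaining PROPOSED entry keeps `analysis_complete` false, matching the rule that the phase ends only when every requirement has an agreed interpretation.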


Prototyping

Certain requirements may involve concepts or dependencies that fall outside the immediate expertise of the development team. To accurately assess the feasibility of these requirements, prototyping becomes essential. Prototyping allows us to explore, experiment, and gain the necessary knowledge to make informed decisions.

For example:

  • Integrating New Technologies: When a requirement involves using a new or unfamiliar technology, we create a prototype to test its compatibility and performance within our solution.

  • Complex User Interfaces: For most user interface designs, we develop prototypes to validate usability, functionality, and user experience.

  • System Interoperability: When the project requires integration with external systems or APIs, we prototype these connections to ensure smooth and reliable interactions.

  • Performance Testing: Prototypes can be used to test the system's performance under various conditions, identifying potential bottlenecks and optimization opportunities.

Prototyping helps us mitigate risks, validate assumptions, and refine our approach before full-scale development begins, ensuring a more efficient and effective development process.
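As an example of the performance-testing use of prototypes, a throwaway harness can time a candidate operation under repeated load before full-scale development begins; the workload below is a stand-in:

```python
import time
import statistics

def soak(func, iterations: int = 200) -> dict:
    """Run `func` repeatedly and summarise its latency,
    to spot bottlenecks before full-scale development."""
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        func()
        samples.append(time.perf_counter() - start)
    return {
        "mean_s": statistics.mean(samples),
        "p95_s": sorted(samples)[int(iterations * 0.95)],
    }

# Stand-in workload; a real prototype would exercise the unfamiliar
# technology or external API under evaluation.
report = soak(lambda: sum(range(1000)))
```

A prototype harness like this is deliberately disposable: its purpose is the numbers it produces, not the code itself.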


Design

Based on the requirements outlined in the Requirement Specification, we design the system architecture, user interfaces, and other key components with security as a primary focus.

During the design phase, new requirements or adjustments to existing requirements may arise. As a result, we must remain flexible and revisit previous steps in the process to accommodate these necessary changes, often requiring a new iteration of Requirement Analysis.

Throughout the design phase, we will often produce:

  • System Architecture Diagrams: Detailed representations of the system's structure, including modules, data flow, and integration points.

  • User Interface Mockups: Visual prototypes of the user interfaces to ensure alignment with user experience goals and client expectations.

  • Database Schemas: Structured plans for data storage, ensuring efficient data management and retrieval.

  • Security Models: Plans addressing security measures to protect data integrity and privacy.

  • Technical Specifications: Documentation outlining the functionalities and interactions of each component, providing a clear blueprint for the development team.

Testing

The concept of testing should precede the implementation phase. Although there is nothing to test before code is written, enough information should be available for developers to prioritize testing from the outset. This shifts the traditional mindset of implementing first and testing later to a proactive one: considering how to test functionality before implementing it, so that the implementation is inherently testable.

Testing is conducted throughout the entire development phase and extends beyond it until we are satisfied with the quality assurance.

Artifacts from testing often include:

  • Unit Tests: Comprehensive tests for individual components, with a minimum coverage target of 90%; anything below that is unacceptable. These tests ensure each part of the codebase functions as intended.

  • Integration Tests: Tests to ensure that different modules and components work together seamlessly.

  • System Tests: End-to-end testing of the complete system to verify that it meets all specified requirements.

  • Performance Tests: Assessments to ensure the system performs well under various conditions and loads.
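As a minimal illustration of the unit-test artifact, the sketch below tests a hypothetical clamp helper with Python's built-in unittest; the helper itself is invented for the example:

```python
import unittest

def clamp(value: float, low: float, high: float) -> float:
    """Hypothetical component under test: restrict value to [low, high]."""
    if low > high:
        raise ValueError("low must not exceed high")
    return max(low, min(value, high))

class ClampTests(unittest.TestCase):
    # Exercising every branch keeps coverage at 100% for this helper,
    # comfortably above a 90% floor.
    def test_within_range(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_below_range(self):
        self.assertEqual(clamp(-3, 0, 10), 0)

    def test_above_range(self):
        self.assertEqual(clamp(42, 0, 10), 10)

    def test_invalid_bounds(self):
        with self.assertRaises(ValueError):
            clamp(1, 10, 0)

# Run the suite explicitly (equivalent to `python -m unittest`).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ClampTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Writing the test cases against every branch of the function, including the error path, is what keeps coverage measurable and the implementation inherently testable.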


Development

During the development phase, code is written with testability and security as core pillars. Features are developed in separate branches, and once a feature is completed and passes all its tests, a merge request is created. A reviewer then examines the merge request; if adjustments are necessary, the developer makes the modifications before the updated merge request is reviewed again. Once accepted, the code is merged into the main codebase. Each change is tracked using a versioning system (Git), allowing any state of the source code to be investigated.

If developers encounter issues that require changes to the Design or necessitate negotiations about the Requirements, these issues should be promptly raised with the responsible parties for each phase to find a solution.


Code Freeze

Once all features, or a limited set of features due to time constraints, have been merged into the main branch, a code freeze is implemented. During a freeze, no new features may be merged, and the current state of the main branch is assigned a tag. Development can continue in other branches, but the main branch is off-limits to ensure stability and focus on finalizing the current release. This period is used for final testing, bug fixes, and preparation for the release.

Only critical bugs may result in a slight thaw, where patches are merged to address these issues. Otherwise, the freeze remains in effect, ensuring the integrity and stability of the main branch until the release is complete.

During this time, the deployment team is notified about the upcoming release so they can schedule and prepare for the necessary work to deploy the solution into production.


Staging

Before deploying the solution, it undergoes testing in a staging environment. During this phase, a period of soak testing is conducted, in which the system is used as if it were in production. The duration of the soak test varies depending on the circumstances; the testing team determines the appropriate length on a case-by-case basis.


Deployment

Once the testing team gives its approval, the tagged release is assigned an appropriate version number and a build is pushed to the registry in the production environment. During the scheduled maintenance window, the deployment team pushes the update to the production system and monitors it for any unusual behavior over the next few hours. If any issues arise, a rollback to the previous version is performed.
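Version numbers for tagged releases commonly follow semantic versioning (major.minor.patch). The helper below is a generic sketch of that convention, not NODA's actual release tooling:

```python
def bump(version: str, part: str) -> str:
    """Return the next semantic version string.
    `part` is one of 'major', 'minor', or 'patch'."""
    major, minor, patch = (int(p) for p in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"  # breaking change: reset minor and patch
    if part == "minor":
        return f"{major}.{minor + 1}.0"  # new feature: reset patch
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"  # bug fix only
    raise ValueError(f"unknown part: {part}")
```

For example, `bump("1.4.2", "minor")` yields `"1.5.0"`, which would then become the Git tag for the frozen release.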


Maintenance

Once the deployment team completes their work, the maintenance period begins. During this phase, automated logging mechanisms are in place to detect and report any unexpected behavior. These systems continuously monitor the application, ensuring that any issues are promptly identified and addressed.

Additionally, regular updates and patches are applied to maintain the system’s security and performance.
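The automated detection described above can be sketched with Python's standard logging module; the handler and wiring here are illustrative, not the monitoring stack NODA actually runs:

```python
import logging

class ErrorCounter(logging.Handler):
    """Count ERROR-and-above records so monitoring can alert on them."""
    def __init__(self) -> None:
        super().__init__(level=logging.ERROR)
        self.errors = 0

    def emit(self, record: logging.LogRecord) -> None:
        self.errors += 1

logger = logging.getLogger("app")
logger.setLevel(logging.INFO)
counter = ErrorCounter()
logger.addHandler(counter)

logger.info("routine request handled")     # below ERROR: not counted
logger.error("unexpected behaviour seen")  # counted

# In production, a non-zero count would trigger an alert
# to the maintenance team.
```

Because the handler itself is registered at ERROR level, routine INFO traffic passes through untouched while anything unexpected is tallied for alerting.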

Software Lifecycle

We follow a structured process to ensure smooth software deployment. This process guides us through code commits, testing, deployment, and potential rollback, ensuring reliability and consistency throughout.

graph TD;
    ReportIssue[Report Issue]

    Start([Start]) --> A[Planning]
    A --> B[Requirement Analysis]
    B --> C[Prototyping]
    C --> D[Design]
    D --> E[Development]
    E --> F{Run Tests}
    F -->|Pass| G[Code Freeze]
    G --> Staging[Staging]
    Staging --> I[Tag Release]
    Staging -->|Major Issue| ReportIssue
    I --> J[Manual Building\nof Container]
    J --> K{Deploy to\nProduction}
    K -->|Success| L([Success])
    K -->|Fail| M[Revert to Previous\nVersion]
    M --> ReportIssue --> E
    F -->|Fail| E
    F -->|Major Issue| D
    D -->|Major Issue| B