Development Guideline

  1. SDLC
  2. Requirement Analysis
  3. What is Requirement Analysis Requirements analysis is the process of determining user expectations for a new or modified product. The success or failure of a software project depends largely on the requirements analysis stage. Requirements should be documented, actionable, measurable, testable, traceable, and related to identified business needs or opportunities.
  4. Requirements Gathering Problems  Customers don't (really) know what they want • Ensure sufficient time is spent on requirements • Make any assumptions visible • Have the customer read, think about, and sign off on the requirements  Requirements change during the course of the project • Have a clearly defined process for receiving change requests • Set milestones for each development phase beyond which certain changes are not permissible • Ensure that change requests and approvals are clearly communicated to all stakeholders
  5. Requirements Gathering Problems  Customers have unreasonable timelines • Convert the software requirements into a project plan • Ensure that the project plan takes available resources into account  Communication gaps exist between customers, engineers and project managers • Take notes at every meeting and distribute them to all team members
  6. Fact Finding Techniques Interview: • In this technique the analyst sits face to face with people and records their responses. • The analyst can also ask people about their current problems and their requirements for the proposed system. • Follow the rules below when conducting an interview: • Determine the people to interview. • Establish objectives for the interview. • Develop interview questions. • Prepare for the interview. • Conduct the interview. • Document the interview. • Evaluate the interview.
  7. Fact Finding Techniques Questionnaire: • A questionnaire consists of a series of questions framed together in a logical manner. The questions should be simple, clear and to the point. Questions can be of two types: Free-format questions: users are allowed to answer these questions freely, in their own words. Examples are: • Is this feature a process? • When will this feature be used? • Who will provide the inputs for the feature? • Who will receive the outputs of the feature? • What is the end result of doing this? Fixed-format questions: - Predefined, fixed-format questions - Users choose their answer from the given options - Question types include multiple-choice questions (Yes or No type), rating questions (Strongly agree, Agree, Disagree, Strongly disagree) and ranking questions - Examples are: - Do you feel that the reports are useful? Ο Yes Ο No - Are the reports useful for your job? Ο Yes Ο No - If your answer is No, please explain why.
  8. Fact Finding Techniques • Form to collect requirements for Analysts
  9. Fact Finding Techniques Observation: • In this technique the analyst visits the organization • and observes and understands the working of the existing system User Story: • A description consisting of one or more sentences • Written in natural language • A typical user story should follow the template below: “As a <type of user>, I want <some goal> so that <some reason>”. User stories are useful for expressing three important questions: • Who are we building it for, who is the user? — As a <type of user> • What are we building, what is the intention? — I want <some goal or objective> • Why are we building it, what value does it bring for the user? — So that <benefit, value>
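The user story template above can be sketched as a small class. This is a minimal illustration; the class name and fields are assumptions, not part of the guideline:

```java
// Minimal sketch of the "As a ..., I want ... so that ..." template.
// Class and field names are illustrative assumptions.
public class UserStory {
    private final String userType; // Who are we building it for?
    private final String goal;     // What are we building?
    private final String reason;   // Why are we building it?

    public UserStory(String userType, String goal, String reason) {
        this.userType = userType;
        this.goal = goal;
        this.reason = reason;
    }

    // Renders the story in the canonical template form.
    public String render() {
        return "As a " + userType + ", I want " + goal + " so that " + reason + ".";
    }
}
```

For example, `new UserStory("registered customer", "to reset my password", "I can regain access to my account").render()` yields the full sentence.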
  10. Fact Finding Techniques Brainstorming: • A technique to generate new ideas and find solutions to a specific issue • It is conducted with several team members and stakeholders • Members from different departments and domain experts are included on the team • The team discusses the problem and arrives at a solution collectively Document Analysis: • It involves analyzing and gathering information from existing documents and other related material • It includes analyzing design documents, templates and manuals of existing systems • This technique is adopted when an existing system needs to be replaced or enhanced
  11. Traceability Matrix (TM) • A table used to trace requirements • Each requirement in the RTM document is linked with its associated test case, so that testing can be done as per the stated requirements • Each bug ID is also included and linked with its associated requirement and test case • The main goals of this matrix are:  Make sure the software is developed as per the stated requirements  Help in finding the root cause of any bug  Help in tracing the documents developed during different phases of the SDLC
  12. Traceability Matrix (TM)
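The requirement-to-test-case-to-bug linkage described above can be sketched in code. All IDs and names below are illustrative assumptions:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Minimal sketch of a requirements traceability matrix: each requirement ID
// is linked to its test case IDs and any bug IDs raised against it.
public class TraceabilityMatrix {
    // requirement ID -> associated test case IDs
    private final Map<String, List<String>> reqToTests = new LinkedHashMap<>();
    // requirement ID -> bug IDs found while testing it
    private final Map<String, List<String>> reqToBugs = new LinkedHashMap<>();

    public void link(String reqId, List<String> testCaseIds, List<String> bugIds) {
        reqToTests.put(reqId, testCaseIds);
        reqToBugs.put(reqId, bugIds);
    }

    // A requirement with no linked test case is a coverage gap: the matrix
    // makes it visible that the software is not being verified against it.
    public List<String> untestedRequirements() {
        return reqToTests.entrySet().stream()
                .filter(e -> e.getValue().isEmpty())
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }
}
```

Querying `untestedRequirements()` supports the first goal listed above: ensuring every stated requirement is actually exercised by a test.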
  13. Requirement Specification (SRS)
  14. What is SRS • Completely describes what the proposed software should do without describing how the software will do it • It is usually signed off at the end of the requirements engineering phase • The software designer takes the SRS as input to design the software, i.e. to specify how the software will do it
  15. Qualities of SRS • Correct • Unambiguous • Complete • Consistent • Ranked for importance and/or stability • Verifiable • Modifiable • Traceable
  16. Scope • Identify the software product(s) to be produced by name • Explain what the software product(s) will, and, if necessary, will not do • Describe the application of the software being specified, including relevant benefits, objectives, and goals
  17. References • Provide a complete list of all documents referenced elsewhere in the SRS • Identify each document by title, report number (if applicable), date, and publishing organization • Specify the sources from which the references can be obtained.
  18. Problem Domain • A Context Diagram depicts the current system under consideration as a single high-level process • It shows the relationships between the system and external entities • A Context Diagram provides no information about the timing, sequencing, or synchronization of processes (which processes occur in sequence or in parallel)
  19. Current System Description • This section contains a set of use cases that describe the current system's behaviors. • A typical use case includes these elements, usually laid out in a table: • Use Case Number: ID to represent your use case • Use Case Name: the name of your use case; keep it short and sweet • Use Case Description: elaborates on the name, in paragraph form • Primary Actor: the main actor that this use case represents • Precondition: what preconditions must be met before this use case can start • Postcondition: the state of the software after the basic course of events is complete • Trigger: what event triggers this use case
  20. Proposed System Requirements Functional Requirements: • Define the internal workings of the software, such as calculations, technical details, data manipulation and processing • Each function should be described with a Use Case diagram and a detailed description • The behavior to be implemented should be described in plain English Non-Functional Requirements: • Cover requirements concerning software design or implementation, for example: • Performance (e.g. response time) • Reliability • Maintainability • Security • Data Integrity • Usability
  21. Constraints • Tools to be used • Software interface • Hardware interface • Development constraints • Development environments • Deployment environments
  22. Software System Design
  23. What is Software Design • Software design is a process to transform user requirements into a suitable form that helps the programmer in software coding and implementation • It concentrates on the problem domain in order to turn it into a solution domain • Its sole purpose is to specify how to fulfill the requirements stated in the SRS
  24. Architectural Model • Service Oriented Architecture (SOA) • 3-Tier Architecture • N-Layer Architecture
  25. SOA • SOA is an evolution of distributed computing based on the request/reply design paradigm. • In this approach, an application's business logic or individual functions are modularized and presented as services for consumer/client applications.
  26. 3-Tier Architecture • Tier means unit of deployment • A 3-tier application is an application program that is organized into three major parts, each of which is distributed to a different place or places in a network. The three parts are: • The workstation or presentation interface • The business logic • The database and programming related to managing it
  27. N-Layer Architecture • It focuses on grouping related functionality within an application into distinct layers. • In every business software system, four layers exist: Presentation Layer, Business Logic Layer, Data Access Layer and Data Storage Layer. • These layers (mainly the Business and Data Access Layers) can be broken down further into several more layers using Class Library projects.
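The layering described above can be sketched in a few classes: each layer talks only to the one below it. Class and method names here are illustrative assumptions:

```java
// Minimal sketch of the four-layer structure: presentation -> business -> data access.
// (The Data Storage Layer, i.e. the database, is stubbed out.)

// Data Access Layer: the only layer that touches storage.
class CustomerDataAccess {
    String findNameById(int id) {
        // In a real system this would query the Data Storage Layer (database).
        return id == 1 ? "Alice" : null;
    }
}

// Business Logic Layer: rules and validation; knows nothing about the UI.
class CustomerService {
    private final CustomerDataAccess dataAccess = new CustomerDataAccess();

    String greetingFor(int id) {
        String name = dataAccess.findNameById(id);
        if (name == null) throw new IllegalArgumentException("Unknown customer: " + id);
        return "Hello, " + name;
    }
}

// Presentation Layer: formats output for the user; delegates all logic downward.
public class CustomerPage {
    private final CustomerService service = new CustomerService();

    public String render(int id) {
        return "<h1>" + service.greetingFor(id) + "</h1>";
    }
}
```

Because the presentation class never touches data access directly, either of the lower layers can be replaced (or further subdivided, as the slide suggests) without changing the UI code.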
  28. Design Pattern • Repository Pattern: • In many applications, the business logic accesses data from data stores such as databases or Web services. Directly accessing the data can result in the following: • Duplicated code • A higher potential for programming errors • An inability to easily test the business logic in isolation from external dependencies • Use a repository to separate the logic that retrieves the data and maps it to the entity model from the business logic. • The repository mediates between the data source layer and the business layers of the application. This achieves three benefits: • It centralizes the data access or Web service access logic. • It provides a substitution point for the unit tests. • It provides a flexible architecture that can be adapted as the overall design of the application evolves.
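A minimal sketch of the Repository pattern follows; the entity, interface and class names are illustrative assumptions. The interface is the "substitution point" the slide mentions: business logic depends on it, so a test can supply an in-memory fake instead of a real database:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Illustrative entity.
class Product {
    final int id;
    final String name;
    Product(int id, String name) { this.id = id; this.name = name; }
}

// The repository abstraction: business logic depends only on this interface.
interface ProductRepository {
    Optional<Product> findById(int id);
    void add(Product product);
}

// In-memory implementation, e.g. substituted in during unit tests.
class InMemoryProductRepository implements ProductRepository {
    private final Map<Integer, Product> store = new HashMap<>();
    public Optional<Product> findById(int id) { return Optional.ofNullable(store.get(id)); }
    public void add(Product product) { store.put(product.id, product); }
}

// Business logic sees only the abstraction, never SQL or web-service calls.
class CatalogService {
    private final ProductRepository repository;
    CatalogService(ProductRepository repository) { this.repository = repository; }

    String describe(int id) {
        return repository.findById(id).map(p -> p.name).orElse("not found");
    }
}
```

In production, `InMemoryProductRepository` would be replaced by an implementation backed by a database or web service, with no change to `CatalogService`.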
  29. Design Pattern • Unit of Work Pattern: • A Unit of Work can be defined as a collection of operations that succeed or fail as a single unit. • The Unit of Work pattern ensures that no operation causes side-effects if any one of them fails. • It maintains in-memory lists of the business objects that have been changed (inserted, updated, or deleted) during a transaction. • Once the transaction is completed, all these updates are sent as one big unit of work to be persisted in the database in one go.
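The in-memory change tracking described above can be sketched as follows. This is a simplified illustration (real implementations track entity objects and wrap the final step in a database transaction); all names are assumptions:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the Unit of Work pattern: changes are collected in memory
// and persisted together on commit, so no partial side-effects reach the database.
class UnitOfWork {
    private final List<String> newObjects = new ArrayList<>();
    private final List<String> dirtyObjects = new ArrayList<>();
    private final List<String> removedObjects = new ArrayList<>();

    void registerNew(String entity)     { newObjects.add(entity); }
    void registerDirty(String entity)   { dirtyObjects.add(entity); }
    void registerRemoved(String entity) { removedObjects.add(entity); }

    // All pending changes are flushed as one unit. Until commit() runs,
    // nothing has touched the database, so a failure leaves it unchanged.
    List<String> commit() {
        List<String> statements = new ArrayList<>();
        newObjects.forEach(e -> statements.add("INSERT " + e));
        dirtyObjects.forEach(e -> statements.add("UPDATE " + e));
        removedObjects.forEach(e -> statements.add("DELETE " + e));
        newObjects.clear(); dirtyObjects.clear(); removedObjects.clear();
        return statements; // a real implementation would execute these in one transaction
    }
}
```

The three lists correspond directly to the inserted/updated/deleted objects the slide mentions.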
  30. Domain Model • Class Diagram: • To visualize the domain model, a Class Diagram has to be developed. • A class diagram is a graphical representation of the static view of the system and represents different aspects of the application.
  31. Database Design • Data Dictionary • A Data Dictionary is a collection of descriptions of the data objects or items in a data model. • It helps the programmer know what data they have to deal with. • A Data Dictionary may contain the following columns: • Data Field • Data Type • Length • Nullability • Default Value • Constraint • Description • Purpose
  32. UX Design • User Experience (UX) and User Interface (UI) are some of the most confused and misused terms in our field. A UI without UX is like a painter slapping paint onto canvas without thought • A great product experience starts with UX followed by UI. Both are essential for the product’s success.
  33. UX Design • There should be a “Master Page” concept when designing HTML pages or forms. • There should be two “Master Pages” in the application layer, one for functional pages and another for non-functional pages. • Functional pages should implement the following master page layout. • Non-functional pages should implement the following master page layout.
  34. Coding
  35. Coding • In .NET: use Pascal case for class names • use Pascal case for method names • use camel case for variable names • In Java: use Pascal case for class names • use camel case for method names • use camel case for variable names
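The Java conventions above look like this in practice (the class itself is an illustrative example; in .NET, the method names would be Pascal case instead):

```java
// Java naming conventions: Pascal case for classes,
// camel case for methods and variables.
public class InvoiceCalculator {          // class name: Pascal case

    private double taxRate = 0.15;        // field: camel case

    public double calculateTotal(double netAmount) {  // method: camel case
        double taxAmount = netAmount * taxRate;       // local variable: camel case
        return netAmount + taxAmount;
    }
}
```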
  36. Control Naming Guideline
  37. Testing
  38. Testing • Testing Methods: there are two: • Automated Testing: automated software testing is a process in which software tools execute pre-scripted tests on a software application before it is released into production. • Manual Testing: manual testing is the process of manually testing software for defects. It requires a tester to play the role of an end user and use most or all features of the application to ensure correct behavior. • White-box Approach: • Focuses on the internal mechanism of a software system to ensure that all code statements and logical paths are tested • Testers must have full knowledge of the internal workings or logic • Black-box Approach: • Ignores the internal mechanism and solely focuses on the outputs generated in response to selected inputs • Testers do not need to know the internal workings or logic of the software system
  39. Testing Types • Unit Testing • The process of testing the individual subprograms, subroutines, classes, or procedures in a program • The goal is to find discrepancies between the programs and the program specifications • Programmers perform unit testing after coding is completed • Testing approach: White-box testing • Integration Testing • The goal is to combine individual software modules and test them as a group to identify faults in the interaction between integrated units • The test group performs integration testing; upon finding a fault they issue a Test Incident Report to the system analysts/programmers, who fix the underlying errors • Testing approach: Black- and White-box testing
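A white-box unit test for a single method might look like the minimal, framework-free sketch below (in practice a framework such as JUnit would be used; the class under test and its spec are illustrative assumptions):

```java
// Illustrative class under test.
class PriceRules {
    // Assumed spec: orders of 10 or more items get a 10% discount.
    static double discountedTotal(double unitPrice, int quantity) {
        double total = unitPrice * quantity;
        return quantity >= 10 ? total * 0.9 : total;
    }
}

// White-box unit test: one check per logical path through discountedTotal,
// comparing the program's behavior against the program specification.
class PriceRulesTest {
    static boolean approx(double a, double b) { return Math.abs(a - b) < 1e-9; }

    static void run() {
        assert approx(PriceRules.discountedTotal(5.0, 2), 10.0)  : "no-discount path";
        assert approx(PriceRules.discountedTotal(5.0, 10), 45.0) : "discount path";
    }
}
```

Note that the test exercises both branches of the condition, which is exactly what the white-box approach (all statements and logical paths) requires.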
  40. Testing Types • Functional Testing • The goal of functional testing is to verify that a software application performs and functions correctly according to design specifications • Functional testing is a way of checking software to ensure that it has all the required functionality specified in its functional requirements • The test group performs functional testing; upon finding a fault they issue a Test Incident Report to the system analysts/programmers, who fix the errors • Testing approach: Black-box testing
  41. Testing Types • System Testing • A type of testing in which the complete and integrated software is tested to identify errors • The goal is to check the behavior of the complete and fully integrated software against the software requirements specification (SRS) document • The main focus of this testing is to evaluate business/functional/end-user requirements • The test group performs system testing; upon finding a fault they issue a Test Incident Report to the system analysts/programmers, who fix the problem • Testing approach: Black-box testing • Some system testing types are: • Usability Testing • UI Testing • Performance Testing • Load Testing • Security Testing
  42. Testing Types • Acceptance Testing • Acceptance testing is the process of comparing the application system to its specified requirements and the current needs of its end users. The goal is to determine whether the software end product is acceptable to its end users. • User representatives perform acceptance testing • Testing approach: Black-box testing • Acceptance testing types are: • Alpha testing: testing takes place in the development environment, performed by some end users or user representatives • Beta testing: testing takes place in the users' own environment, performed by the users
  43. Testing Documentation • Test Plan • Test Type: what types of testing (e.g. functional, system) are planned • Test Item: list the functional items and software features to be tested • Estimation: for each test item, list the estimated effort required and duration • Test Specification • Testing Environment • Test Termination Criteria • Test Cases • Test Case ID • Test Title • Priority • Test Steps • Test Data • Expected Result • Actual Result • Pass/Fail
  44. Test Cases • Test cases involve a set of steps, conditions, and inputs that can be used while performing testing tasks.
  45. Test Incident Report • The goal is to document any event that occurs during the test process which requires investigation.
  46. Test Progress Report • In order to show the progress of the test procedure, a test progress report should be prepared by the test group.