
SAS Software Development with the V-Model


Presentation Transcript


  1. SAS Software Development with the V-Model Andrew Ratcliffe RTSL.eu Coders’ Corner Paper 124-2011

  2. Overview
• Best Practice in SAS software development
• Process…
• Flow from: requirements to deployment
• Via: design, build and test
• Coding specifics
• A tip for testing your code
• Andrew Ratcliffe
• First used SAS in 1983
• Provides services through RTSL.eu
• Blogging on NOTECOLON.INFO
• SAS and software development best practice

  3. Best Practice
• Always driven and guided by business purpose
• Repeatable set of steps
• Allows us to plan: time, cost, skills, effort
• Allows us to create templates and guidelines
• Helps newcomers contribute quickly and effectively
• Easier to transfer tasks between people
• Plan – Do – Review
• Make sure everything got done… got done right
• Barely adequate
• Good enough… but only just

  4. Outline Development Process

[V-model diagram: Business Case feeds into What?, then How?, then Build at the base of the V; the right-hand leg rises through Test to Deploy]

  5. Plan and Do

[V-model diagram, left-hand leg: the What? (Customer side) comprises Business Requirements and System Requirements; the How? (Supplier side) comprises Design Specification and Unit Specification; the leg ends at Build]

  6. Traceability
• Well-structured text, not prose
• Uniquely identify every element
• Make sure nothing is missed
• Make sure nothing is added

[Diagram: the What?/How? leg again, Business Requirements → System Requirements → Design Specification → Unit Specification → Build, with traceability links between adjacent levels]

  7. Testing
• Defined objectives
• Well-structured text, not prose
• Uniquely identify every element
• Repeatable steps
• Test Strategy defines approach, coverage, etc.
• Make sure nothing is missed/added

[V-model diagram pairing each specification level with a test level: Business Requirements with User Acceptance, System Requirements with System Test, Design Specification with Integration, Unit Specification with Unit, and Build with Peer Review against Coding Standards; the Test Strategy governs all test levels]

  8. Test Strategy
• How will you be sure the system does what it should?
• Types of testing
  • Static / dynamic
  • Inspection of results & data
  • Baseline for comparison / expected results (see the sketch below)
  • Automated / manual
• Coverage
  • 100%
  • Spot checks
  • How many / which & what
• Artefacts & evidence to be archived
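One way to automate the "baseline for comparison" point is PROC COMPARE against a stored set of expected results. A minimal sketch, assuming hypothetical library, path and data set names (not from the paper):

libname baseline "/project/tests/baseline";   /* hypothetical path            */

proc compare base=baseline.expected_totals    /* previously approved results  */
             compare=work.actual_totals       /* this run's output            */
             criterion=1e-8;                  /* tolerance for numeric noise  */
run;

/* After PROC COMPARE, &sysinfo is 0 for a perfect match; any non-zero */
/* value encodes the kind of difference found, so a test harness can   */
/* fail the test automatically instead of relying on eyeballing.       */
%put NOTE: PROC COMPARE return code is &sysinfo;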

  9. Test Strategy - Detail
(Diagram callouts: Test Strategy, Coding Standards)
• Units will leave environments as they found them
  • Aside from planned / designed behaviour
• No memory leakage (memory freed up at appropriate times)
• No temporary libraries remain assigned
• No temporary data sets remain
• No macro variables remain (a sketch of this check follows the log below)

Log from running the harness:

141  %tharness(testmacro=BestCodeEver);
THARNESS: Starting execution of code to be tested
NLOBS=2.7481588701
THARNESS: Execution of code to be tested has finished
THARNESS: Summarising results
THARNESS: Macro variable has been added: scope=GLOBAL name=NLOBS
THARNESS: Library assignment has been added: libname=NFIND_ME
THARNESS: End of results summary
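The macro-variable check reported in that log can be built the same way as the library check shown on later slides: snapshot SASHELP.VMACRO before and after the macro-under-test runs, then merge the two images by name. A minimal sketch with illustrative data set names (not the paper's exact code):

/* Before image of global macro variables. */
proc sort data=sashelp.vmacro(where=(scope='GLOBAL'))
          out=work._th_vmacro_before nodupkey;
  by name;
run;

/* ... run the macro-under-test here ... */

/* After image of global macro variables. */
proc sort data=sashelp.vmacro(where=(scope='GLOBAL'))
          out=work._th_vmacro_after nodupkey;
  by name;
run;

/* Report any global macro variable added by the tested code. */
data _null_;
  merge work._th_vmacro_before (in=before)
        work._th_vmacro_after  (in=after);
  by name;
  if after and not before then
    put "THARNESS: Macro variable has been added: " scope= name=;
run;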

  10. Deletes its own temporary WORK data sets
• Facilitated by the fact that the names of all of the macro's WORK data sets are prefixed with _THARNESS_ (achieved generically with _&sysmacroname._; see the sketch below)
• By using the same prefix, the data sets can be deleted at the end of the macro by specifying _THARNESS_: on PROC DATASETS' DELETE statement
• Conditional upon &tidy (for debugging)

%if %upcase(%substr(&tidy,1,1)) eq Y %then %do;
  proc datasets lib=work nolist;
    delete _&sysmacroname._: ;
  quit;
  options notes;
%end;
%mend tharness;
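For the prefix convention to pay off, every data set the macro creates must carry it. A small self-contained illustration (the macro and data set names here are hypothetical, not from the paper):

%macro demo(tidy=y);
  /* Every WORK data set is named with the _DEMO_ prefix,     */
  /* generated generically via _&sysmacroname._ .             */
  data work._&sysmacroname._stage1;
    x = 1;
  run;

  data work._&sysmacroname._stage2;
    set work._&sysmacroname._stage1;
    y = x * 2;
  run;

  /* One DELETE statement with the prefix and a colon         */
  /* (name-prefix wildcard) removes them all at the end.      */
  %if %upcase(%substr(&tidy,1,1)) eq Y %then %do;
    proc datasets lib=work nolist;
      delete _&sysmacroname._: ;
    quit;
  %end;
%mend demo;

%demo;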

  11. Approach: Snapshot then Compare
• Snapshot of elements of the environment taken before and after execution of the macro-under-test, e.g. sashelp.vslib
• Comparison of before and after images done with DATA steps and PUT statements (more flexible than PROC COMPARE)

data _null_;
  merge work._&sysmacroname._vslibbefore (in=before)
        work._&sysmacroname._vslibafter  (in=after)
        end=finish;
  by libname;
  retain IssueFound 0;
  if before and not after then do;
    put "&sysmacroname: Library assignment has been removed: " libname=;
    IssueFound=1;
  end;
  else if not before and after then do;
    put "&sysmacroname: Library assignment has been added: " libname=;
    IssueFound=1;
  end;
  if finish and not IssueFound then
    put "&sysmacroname: No library assignment issues found";
run;

  12. Approach: Snapshot then Compare

data work._&sysmacroname._vslibbefore;
  set sashelp.vslib;
run;

data _null_;
  merge work._&sysmacroname._vslibbefore (in=before)
        work._&sysmacroname._vslibafter  (in=after)
        end=finish;
  by libname;
  retain IssueFound 0;
  if before and not after then do;
    put "&sysmacroname: Library assignment has been removed: " libname=;
    IssueFound=1;
  end;
  else if not before and after then do;
    put "&sysmacroname: Library assignment has been added: " libname=;
    IssueFound=1;
  end;
  if finish and not IssueFound then
    put "&sysmacroname: No library assignment issues found";
run;

Log:

THARNESS: Library assignment has been added: libname=NFIND_ME
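The slide shows the "before" snapshot and the comparison; the steps in between (invoking the macro-under-test, then taking the "after" snapshot) follow the same pattern. A minimal sketch of the overall harness sequence, with an illustrative macro name (the paper's full %tharness code is not reproduced here):

%macro tharness_sketch(testmacro=);
  /* 1. Before image of library assignments. */
  data work._&sysmacroname._vslibbefore;
    set sashelp.vslib;
  run;

  /* 2. Invoke the macro under test by the name passed in. */
  %put &sysmacroname: Starting execution of code to be tested;
  %&testmacro;
  %put &sysmacroname: Execution of code to be tested has finished;

  /* 3. After image of library assignments. */
  data work._&sysmacroname._vslibafter;
    set sashelp.vslib;
  run;

  /* 4. Compare the two images with the merge/PUT step shown above. */
%mend tharness_sketch;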

  13. Echoes all of its parameters upon initiation
• Ensures that the values of those parameters taking default values (and hence not specified in the calling program) are known to anybody inspecting the log

%macro tharness(testmacro=, tidy=y);
  %put &sysmacroname: Parameters received by this macro are:;
  %put _local_;
  %put ;

Log output:

141  %tharness(testmacro=BestCodeEver);
THARNESS: Parameters received by this macro are:
THARNESS TESTMACRO BestCodeEver
THARNESS TIDY y
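This works because macro parameters are created in the macro's local symbol table, so %put _local_ at the top of a macro writes every parameter to the log as scope, name and value, defaults included. A standalone illustration with a hypothetical macro:

%macro report(dsname=sashelp.class, obs=10);
  /* Parameters live in the local symbol table, so this one  */
  /* statement echoes them all, including OBS, which the     */
  /* caller below leaves at its default.                     */
  %put &sysmacroname: Parameters received by this macro are:;
  %put _local_;
%mend report;

%report(dsname=sashelp.cars);
/* Log shows:
   REPORT: Parameters received by this macro are:
   REPORT DSNAME sashelp.cars
   REPORT OBS 10
*/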

  14. Summary
• Plan – Do – Review
• Barely Adequate
• What – How – Build
• Traceability (vertical)
• Testing
• Traceability (horizontal)
• Peer review – unit – integration – system – user

  15. NOTECOLON.INFO Andrew Ratcliffe Thank you for listening. Enjoy your evening!
