Investigating the causes of the large scatter in CFD results seen in the AIAA Drag Prediction Workshops, focusing on errors and grid convergence issues. Using the NSU3D unstructured mesh solver, the study examines grid convergence, comparison with experimental data, modeling errors, grid refinement effects, and sensitivity to distance function evaluation, viscous term formulation, and artificial dissipation.
Grid Resolution Study of a Drag Prediction Workshop Configuration using the NSU3D Unstructured Mesh Solver Dimitri J. Mavriplis Department of Mechanical Engineering University of Wyoming
Motivation • Both AIAA Drag Prediction Workshops have shown large scatter in CFD results • Grid convergence not achieved • Both workshops have also shown generally poor agreement of CFD with experimental results • CL vs. incidence • Moment predictions • Notable exceptions in both workshops • Not well understood
Intricacies of DPW Cases • Generic/Simple Wing-Body (Nacelle) configurations • Substantial amounts of flow separation • Appears in sensitive areas (trailing edge) • Indicative of off-design conditions • Particularly challenging for CFD codes • DPW is a good test case for sensitivity studies in CFD solvers • More benign attached-flow cases should be a piece of cake for current RANS solvers
Motivation • Follow-on studies: • E.M. Lee-Rausch et al.: AIAA 2003-3400 • E.M. Lee-Rausch et al.: AIAA 2004-0554 • General lack of grid convergence • In many cases, agreement between codes gets worse with increased grid refinement • Issues raised (running different codes on the same grids with the same turbulence model): • Modeling vs. discretization error • Effects due to: • Structured vs. unstructured grids • Cell-centered vs. vertex-based discretizations • Use of prisms or tetrahedra in boundary-layer regions • Upwind vs. artificial dissipation • Thin-layer vs. full Navier-Stokes terms • Distance function calculation
Motivation • Follow up on DPW and previous studies to better understand sources of error • Determine dominant sources of error through sensitivity studies • Modeling error: • Distance function evaluation methods • Thin-layer vs. full Navier-Stokes terms • Dissipation levels (re: upwind flux functions) • Grid resolution • Goal is not to match experimental data but to better understand the importance of various sources of error and establish requirements for grid convergence • Leave turbulence and transition models unchanged • SA model/fully turbulent
Families of Grids • Grid convergence studies require families of grids derived by globally coarsening or refining an initial grid to maintain similar relative variations in grid resolution • Family of 4 grids: • 1.1, 3.0, and 9.1 million points: original DPW2 grids (VGRIDns) • 72 million points: generated by uniform refinement of the 9.1 million pt grid • Single grid of a different family • 65 million points: generated c/o S. Pirzadeh, NASA Langley, VGRIDns 64-bit version on the NASA Columbia supercomputer
Baseline (3M pt) DPW2 Grid • 4 cells across blunt TE • 7.0e-06 chord spacing at the wall • y+ ~ 1, 26 cell layers in boundary layer
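As a sanity check (not from the slides), the quoted wall spacing can be reproduced from the y+ ~ 1 target at Re = 3 million with a flat-plate skin-friction correlation; the correlation constant and the function below are illustrative assumptions, not part of the NSU3D setup:

```python
import math

def first_cell_height(y_plus, re_c, chord=1.0):
    """Estimate the first-cell wall spacing (in chords) for a target y+
    using a flat-plate skin-friction correlation (illustrative only)."""
    cf = 0.026 / re_c ** (1.0 / 7.0)      # flat-plate turbulent Cf estimate
    u_tau_over_u = math.sqrt(cf / 2.0)    # friction velocity / freestream
    # y+ = y * u_tau / nu, with nu = U_inf * chord / Re_c
    return y_plus * chord / (re_c * u_tau_over_u)

print(first_cell_height(1.0, 3.0e6))      # ≈ 8.5e-06 chords
```

This lands within roughly 20% of the 7.0e-06 chord spacing quoted for the baseline grid, which is about the level of agreement such correlations can offer.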
Grid Specifications • NSU3D preprocessing converts tetrahedra into prisms in boundary-layer regions
Grid Specifications 3.0 million pt grid 72 million pt grid
Grid Specifications 65 million pt grid 72 million pt grid
NSU3D Unstructured RANS Solver • Vertex-based discretization • Central difference + matrix dissipation • Multi-dimensional thin-layer Navier-Stokes • Implicit point/line solver in boundary-layer regions • Agglomeration multigrid • ~500 cycles to converge, 1000 cycles for higher-CL cases • Parallelization through MPI and/or OpenMP
NSU3D Convergence and Scalability • Mach = 0.75, Incidence = 0°, Reynolds = 3 million • 3.0 million point grid: 2.5 hours on 16 Opterons
NSU3D Convergence and Scalability 3 Tflops on 2008 cpus on NASA Columbia 20 minutes for 72 million pt grid (benchmark case)
NSU3D Convergence and Scalability • 3.0 million pts: • 2.5 hours on cluster of 16 Opterons • 10 minutes on 128 cpus of NASA Columbia • Most runs done on Opteron Cluster • 72 million pts: • 4.5 hours on 128 cpus of NASA Columbia • 20 minutes on 2008 cpus of NASA Columbia • Most runs done on 128 cpus of Columbia
Sensitivity Studies • Sensitivity to Distance Function Evaluation • Sensitivity to Viscous Term Formulation • Sensitivity to Artificial Dissipation • Grid Resolution Effect on Above Sensitivities • Mode of Operation: • Fixed incidence (AoA = 0°) • Simpler for grid convergence study • Mach = 0.75 and Mach = 0.3 • Subsonic case contains no discontinuities • No change in turbulence/transition model
Sensitivity to Distance Function • 3 distance function formulations • Exhaustive search • Slow but exact; may be discontinuous • Eikonal/Hamilton-Jacobi equations • Used for level-set problems • Fast-sweep algorithm • Fast, accurate, smooth, but not parallel • Poisson equation • Fast, smooth, parallel • Good accuracy near surface, poor accuracy away from surface
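The exhaustive-search option can be sketched in a few lines; the geometry below is a hypothetical toy surface, and production solvers would use a spatial search structure to tame the O(N_vol × N_surf) cost:

```python
import numpy as np

def wall_distance_exhaustive(volume_pts, surface_pts):
    """Exact wall distance by brute-force search over all surface points.
    Slow (O(N_vol * N_surf)) but exact; the resulting field can have a
    discontinuous gradient where the nearest surface point switches."""
    d = np.empty(len(volume_pts))
    for i, p in enumerate(volume_pts):
        d[i] = np.min(np.linalg.norm(surface_pts - p, axis=1))
    return d

# Toy check: distances from two points to a discretized unit circle.
theta = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
surface = np.stack([np.cos(theta), np.sin(theta)], axis=1)
volume = np.array([[0.0, 0.0], [2.0, 0.0]])
print(wall_distance_exhaustive(volume, surface))  # both distances ≈ 1.0
```

The Eikonal and Poisson alternatives on the slide trade this exactness for speed and smoothness, which is precisely what the sensitivity study probes.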
Sensitivity to Distance Function • Mach = 0.75, Reynolds number = 3 million
Sensitivity to Distance Function • Eikonal equation results very close, much faster • Mach = 0.75, AoA = 0°, Reynolds = 3 million
Sensitivity to Viscous Term Formulation • NSU3D uses a multi-dimensional thin-layer approximation • Computed along mesh edges • NSU3D also implements full Navier-Stokes terms • Double loop over edges (gradients, then fluxes) • Extended stencil • Hybrid: normal 2nd differences as above, cross terms using double loop • 30% more expensive, less robust: inexact Jacobian
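The single-loop edge structure behind a thin-layer-style viscous term can be illustrated on a scalar diffusion residual; the mesh data and weights below are hypothetical, and the full Navier-Stokes cross terms would need a second pass (gradients first, then fluxes) with an extended stencil:

```python
import numpy as np

def edge_laplacian(u, edges, weights):
    """Single loop over edges accumulating a diffusion residual from
    differences along each edge only -- the flavor of an edge-based
    thin-layer viscous discretization. Each edge flux is added to one
    endpoint and subtracted from the other (conservative assembly)."""
    res = np.zeros_like(u)
    for (i, j), w in zip(edges, weights):
        flux = w * (u[j] - u[i])   # edge-aligned difference only
        res[i] += flux
        res[j] -= flux
    return res

# Tiny 1-D chain of 3 points with unit edge weights (hypothetical mesh).
u = np.array([0.0, 1.0, 4.0])
print(edge_laplacian(u, [(0, 1), (1, 2)], [1.0, 1.0]))
```

The single loop touches only nearest neighbors, which is why the full-Navier-Stokes double loop is quoted as ~30% more expensive and harder to treat implicitly (inexact Jacobian).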
Sensitivity to Viscous Term Formulation • Less than 0.5% variation in CL on finer grids • Less than 2 counts variation in CD on finer grids
Grid Convergence and Dissipation Sensitivity • Sensitivity to dissipation • Run with nominal value of dissipation parameter • Run with ½ the nominal value of the dissipation parameter • Provides an estimate of the influence of upwind/artificial dissipation, the effect of various flux functions, and dissipation levels • Run on sequence of grids (1M, 3M, 9M, 72M) • Fixed incidence (AoA = 0°) • Mach = 0.75, Mach = 0.3 (no discontinuities) • Expect: CL, CD convergence and reduction in sensitivity to dissipation as grid is refined
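With a self-similar grid family, an observed order of accuracy and a Richardson-extrapolated limit can be computed from any three consecutive grids; the drag values below are hypothetical placeholders, not the workshop data:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order p and Richardson-extrapolated value from three
    solutions on grids related by a refinement ratio r in spacing h
    (r = 3**(1/3) for 3-D grids that triple the point count)."""
    p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
    return p, f_exact

# Hypothetical CD values on a 1.1M/3.0M/9.1M-style family.
p, cd_inf = observed_order(0.0310, 0.0295, 0.0288, 3.0 ** (1.0 / 3.0))
print(p, cd_inf)  # p ≈ 2.1, extrapolated CD ≈ 0.0282
```

This kind of fit only makes sense when the grids are genuinely self-similar, which is why the 65M pt mesh from a different family is treated separately below.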
Grid Convergence and Dissipation • Drag is grid converging • Lift is somewhat erratic: • better grid convergence at lower dissipation values • Sensitivity to dissipation decreases as expected
Grid Convergence and Dissipation • Drag is grid converging • Sensitivity to dissipation decreases as expected • Friction Drag is essentially grid converged
Additional Results: 65M pt Mesh • Add results from 65M pt mesh • This mesh is not self-similar with the others • Different spacings… but approximately the same as the 72M pt mesh • 6-8 vs. 12 points across the trailing edge • No spanwise stretching • 976,828 vs. 474,926 points on aircraft surface • Strictly, should not be plotted on the same N^(-2/3) plot (in 3D, h ∝ N^(-1/3), so second-order error scales as N^(-2/3) only within a self-similar family) • Useful to see differences with other grids
65M pt Mesh Results • 10% drop in CL at AoA = 0°: closer to experiment • Drop in CD: further from experiment • Same trends at Mach = 0.3 • Little sensitivity to dissipation
65M pt Mesh Results • Most of the change in CD comes from friction drag • Recall CDV was "fully" converged • Similar results for Mach = 0.3; low sensitivity to dissipation
65M pt Mesh Results • Much better agreement with experiment (CL and CM)
65M pt Mesh Results • Low on drag • Transition effects remain • Note CDP changes when comparing 65M/72M pt results at fixed CL
Surface CP Comparisons • Improved Inboard, more TE separation • Outboard still shows discrepancies
Conclusions • Surprising to see large differences at high resolution • 10% change in CL from 65M to 72M points • In the presence of apparent grid convergence • General trends reproduced by FUN3D • E.M. Lee-Rausch et al.: AIAA 2005-4842 • Sensitivity to grid resolution/topology dominates the other modeling issues investigated herein • Implications for turbulence/transition modeling
Conclusions • Speculative explanation: • Main difference between fine grids is spanwise stretching at TE • Isotropic cells lead to increased TE separation • Reduces lift substantially when integrated along span • Little effect of root separation (same resolution in both grids) • Further work: generate a coarser family of meshes from the 65M pt mesh • Examine grid convergence between the 2 families of grids • State-of-the-art computers (NASA Columbia) have a large effect on advancing the state of the art of computational aerodynamics