SI: MANUFACTURING AND CONSTRUCTION

Virtual reality for assembly methods prototyping: a review

Abhishek Seth · Judy M. Vance · James H. Oliver

Received: 18 June 2009 / Accepted: 30 December 2009 / Published online: 22 January 2010
© Springer-Verlag London Limited 2010

Abstract Assembly planning and evaluation is an important component of the product design process in which details about how parts of a new product will be put together are formalized. A well designed assembly process should take into account various factors such as optimum assembly time and sequence, tooling and fixture requirements, ergonomics, operator safety, and accessibility, among others. Existing computer-based tools to support virtual assembly either concentrate solely on representation of the geometry of parts and fixtures and evaluation of clearances and tolerances or use simulated human mannequins to approximate human interaction in the assembly process. Virtual reality technology has the potential to support integration of natural human motions into the computer aided assembly planning environment (Ritchie et al. in Proc IMechE Part B J Eng 213(5):461–474, 1999). This would allow evaluations of an assembler's ability to manipulate and assemble parts and result in reduced time and cost for product design. This paper provides a review of the research in virtual assembly and categorizes the different approaches. Finally, critical requirements and directions for future research are presented.

Keywords Virtual assembly · Collision detection · Physics-based modeling · Constraint-based modeling · Virtual reality · Haptics · Human–computer interaction

1 Introduction

Innovation is critical for companies to be successful in today's global economy. Competitive advantage can be achieved by effectively applying new technologies and processes to challenges faced in current engineering design practices. Opportunities encompass all aspects of product development (including ergonomics, manufacture, maintenance, product life cycle, etc.), with the greatest potential impact during the early stages of the product design process. Prototyping and evaluation are indispensable steps of the current product creation process. Although computer modeling and analysis practices are currently used at different stages, building one-of-a-kind physical prototypes makes the current typical process very costly and time consuming. New technologies are needed that can empower industry with a faster and more powerful decision making process. VR technology has evolved to a new level of sophistication during the last two decades. VR has changed the ways scientists and engineers look at computers for performing mathematical simulations, data visualization, and decision making (Bryson 1996; Eddy and Lewis 2002; Zorriassatine et al. 2003; Xianglong et al. 2001). VR technology combines multiple human–computer interfaces to provide various sensations (visual, haptic, auditory, etc.), which give the user a sense of presence in the virtual world. This enables users to become immersed in a computer-generated

This work was performed by Abhishek Seth while at Iowa State University.

A. Seth (✉)

Applied Research, Product Development Center of Excellence, Caterpillar Inc., Peoria, IL, USA
e-mail: abhishekseth@cat.com

J. M. Vance · J. H. Oliver

Department of Mechanical Engineering, Virtual Reality Applications Center, Iowa State University, Ames, IA 50011, USA

e-mail: jmvance@iastate.edu

J. H. Oliver

e-mail: oliver@iastate.edu

Virtual Reality (2011) 15:5–20
DOI 10.1007/s10055-009-0153-y

scene and interact using natural human motions. The ultimate goal is to provide an "invisible interface" that allows the user to interact with the virtual environment as they would with the real world. This makes VR an ideal tool for simulating tasks that require frequent and intuitive manual interaction, such as assembly methods prototyping.

Several definitions of virtual assembly have been proposed by the research community. For example, in 1997, Jayaram et al. defined virtual assembly as "the use of computer tools to make or 'assist with' assembly-related engineering decisions through analysis, predictive models, visualization, and presentation of data without physical realization of the product or supporting processes". Kim and Vance in 2003 described virtual assembly as the "ability to assemble CAD models of parts using a three-dimensional immersive user interface and natural human motion". This definition included the need for an immersive interface and natural interaction as a critical part of virtual assembly. As VR continues to advance, we would like to expand previous definitions to provide a more comprehensive description.

Virtual assembly in this paper is defined as the capability to assemble virtual representations of physical models, through simulating realistic environment behavior and part interaction, to reduce the need for physical assembly prototyping, resulting in the ability to make more encompassing design/assembly decisions in an immersive computer-generated environment.

2 Why virtual assembly?

Assembly process planning is a critical step in product development. In this process, details of assembly operations, which describe how different parts will be put together, are formalized. It has been established that assembly processes often constitute the majority of the cost of a product (Boothroyd and Dewhurst 1989). Thus, it is crucial to develop a proper assembly plan early in the design stage. A good assembly plan incorporates considerations for minimum assembly time, low cost, ergonomics and operator safety. A well-designed assembly process can improve production efficiency and product quality, reduce cost and shorten the product's time to market.

Expert assembly planners today typically use traditional approaches in which the three-dimensional (3D) CAD models of the parts to be assembled are examined on two-dimensional (2D) computer screens in order to assess part geometry and determine assembly sequences for a new product. As final verification, physical prototypes are built and assembled by workers who identify any issues with either the assembly process or the product design. As assembly tasks get more complicated, such methods tend to be more time consuming, costly and prone to errors.

Computer-aided assembly planning (CAAP) is an active area of research that focuses on development of automated techniques for generating suitable assembly sequences based primarily on intelligent identification and grouping of geometric features (Baldwin et al. 1991; de-Mello and Sanderson 1989; Zha et al. 1998; Sung et al. 2001; De Fazio and Whitney 1987; Smith et al. 2001). These methods rely on detailed information about the product geometry, but they do not account for the expert knowledge held by the assembler that may impact the design process. This knowledge, based on prior experience, is difficult to capture and formalize and can be rather extensive (Ritchie et al. 1995). Ritchie et al. (1999) proposed the use of immersive virtual reality for assembly sequence planning. System functionality was demonstrated using an advanced electromechanical product in an industrial environment. Holt et al. (2004) propose that a key part of the planning process is the inclusion of the human expert in the planning. They base their statements on research in cognitive ergonomics and human factors engineering. Leaving the human aspect out of the assembly planning could result in incorrect or inefficient operations. Another limitation of the computer-aided assembly planning methods is that as the number of parts in the assembly increases, the possible assembly sequences increase exponentially, and thus it becomes more difficult to characterize criteria for choosing the most suitable assembly sequence for a given product (Dewar et al. 1997a). Once again, human input is critical to arriving at a cost-effective and successful assembly sequence solution.

Modern CAD systems are also used in assembly process planning. CAD systems require the user to identify constraint information for mating parts by manually selecting the mating surfaces, axes and/or edges to assemble the parts. Thus, these interfaces do not reflect human interaction with complex parts. For complex assemblies, such part-to-part specification techniques make it difficult to foresee the impact of individual mating specifications on other portions of the assembly process, for example ensuring accessibility for part replacement during maintenance, or assessing the effects of changing the assembly sequences. Such computer-based systems also fall short in addressing ergonomic issues, such as awkward-to-reach assembly operations.

VR technology plays a vital role in simulating such advanced 3D human–computer interactions by providing users with different kinds of sensations (visual, auditory and haptic), creating an increased sense of presence in a computer-generated scene. Virtual assembly simulations allow designers to import concepts into virtual environments during the early design stages and perform assembly/disassembly evaluations that would otherwise only be possible much later, when the first prototypes are built. Using

virtual prototyping applications, design changes can be incorporated easily in the conceptual design stage, thus optimizing the design process toward Design for Assembly (DFA). Using haptics technology, designers can touch and feel complex CAD models of parts and interact with them using natural and intuitive human motions. Collision and contact forces calculated in real time can be transmitted to the operator using robotic devices, making it possible for him/her to feel the simulated physical contacts that occur during assembly. In addition, the ability to visualize realistic behavior and analyze complex human interactions makes virtual assembly simulations ideal for identifying assembly-related problems such as awkward reach angles, insufficient clearance for tooling, and excessive part orientation during assembly. They also allow designers to analyze tooling and fixture requirements for assembly.

In addition to manufacturing, virtual assembly systems could also be used to analyze issues that might arise during service and maintainability operations, such as inaccessibility to parts that require frequent replacement. Expert assembly knowledge and experience that is hard to document could be captured by inviting experienced assembly workers from the shop floor to assemble a new design and provide feedback for design changes (Schwartz et al. 2007). Disassembly and recycling factors can also be taken into account during the initial design stages, allowing for an environmentally conscious design. Virtual assembly training can provide a platform for offline training of assembly workers, which is important when assembly tasks are hazardous or especially complicated (Brough et al. 2007).

In order to simulate physical mockups in an effort to provide a reliable evaluation environment for assembly methods, virtual assembly systems must be able to accurately simulate real world interactions with virtual parts, along with their physical behavior and properties (Chryssolouris et al. 2000). To replace or reduce the current prototyping practices, a virtual assembly simulation should be capable of addressing both the geometric and the subjective evaluations required in a virtual assembly operation. Boothroyd et al. (1994) describe the more subjective evaluations of assembly as the following:

• Can the part be grasped in one hand?
• Do parts nest or tangle?
• Are parts easy or difficult to grasp and manipulate?
• Are handling tools required?
• Is access for part, tool or hands obstructed?
• Is vision of the mating surfaces restricted?
• Is holding down required to maintain the part orientation or location during subsequent operations?
• Is the part easy to align and position?
• Is the resistance to insertion sufficient to make manual assembly difficult?

If successful, this capability could provide the basis for many useful virtual environments that address various aspects of the product life cycle such as ergonomics, workstation layout, tooling design, off-line training, maintenance, and serviceability prototyping (Fig. 1).

3 Virtual assembly: challenges

Several technical challenges must be overcome to realize virtual assembly simulations, namely: accurate collision detection, inter-part constraint detection and management, realistic physical simulation, data transfer between CAD and VR systems, intuitive object manipulation (including force feedback), etc. In the following section, these challenges are described and previous approaches in each area are summarized.

3.1 Collision detection

Virtual assembly simulations present a much larger challenge than virtual walkthrough environments, as they require frequent human interaction and real time simulation involving complex models. Real world assembly tasks require extensive interaction with surrounding objects, including grabbing parts, manipulating them realistically and finally placing them in the desired position and orientation. Thus, for successfully modeling such a complex interactive process, the virtual environment not only needs to simulate visual realism, it also needs to model realistic part behavior of the virtual objects. For example, graphic representations of objects should not interpenetrate and should behave realistically when external forces are applied. The first step to accomplish this is implementing accurate collision detection among parts (Burdea 2000).

Fig. 1 Applications of a virtual assembly/disassembly simulation

Contemporary CAD systems typically used in product development incorporate precise geometric models consisting of hierarchical collections of Boundary Representation (B-Rep) solid models bounded with trimmed parametric NURBS surfaces. These representations are typically tessellated for display, and the resulting polygonal graphics representations can be used to detect collisions. However, the relatively high polygon counts required to represent complex part shapes generally result in relatively long computation times to detect collisions. In virtual environments where interactive simulation is critical, fast and accurate collision detection among dynamic objects is a challenging problem.
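To make the cost issue concrete, the following minimal sketch (illustrative only, not code from any of the collision libraries discussed below) shows the bounding-volume idea most of them build on: an axis-aligned bounding-box (AABB) overlap test culls part pairs cheaply so that exact triangle-level tests are only attempted for pairs whose boxes actually intersect.

```python
# Broad-phase culling sketch: cheap AABB overlap tests before any exact triangle tests.
from typing import List, Sequence, Tuple

Vec3 = Tuple[float, float, float]
Triangle = Tuple[Vec3, Vec3, Vec3]


def aabb(triangles: Sequence[Triangle]) -> Tuple[Vec3, Vec3]:
    """Axis-aligned bounding box (min corner, max corner) of a tessellated part."""
    verts = [v for tri in triangles for v in tri]
    lo = tuple(min(v[i] for v in verts) for i in range(3))
    hi = tuple(max(v[i] for v in verts) for i in range(3))
    return lo, hi


def aabbs_overlap(a: Tuple[Vec3, Vec3], b: Tuple[Vec3, Vec3]) -> bool:
    """Two boxes overlap only if their extents overlap on every axis."""
    return all(a[0][i] <= b[1][i] and b[0][i] <= a[1][i] for i in range(3))


def candidate_pairs(parts: List[Sequence[Triangle]]) -> List[Tuple[int, int]]:
    """Indices of part pairs whose boxes overlap; only these need exact tests."""
    boxes = [aabb(p) for p in parts]
    return [(i, j)
            for i in range(len(parts))
            for j in range(i + 1, len(parts))
            if aabbs_overlap(boxes[i], boxes[j])]
```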

Algorithms have been developed to detect collisions using different object representations. Several algorithms that use polygonal data for collision detection were designed by researchers at the University of North Carolina, including I-COLLIDE (Cohen et al. 1995), SWIFT (Ehmann and Lin 2000), RAPID (Gottschalk et al. 1996), V-COLLIDE (Hudson et al. 1997), SWIFT++ (Ehmann and Lin 2001), and CULLIDE (Govindaraju et al. 2003). Other methods such as V-Clip (Mirtich 1998) and VPS (McNeely et al. 1999) have also been proposed for use in immersive VR applications. A comprehensive review of collision detection algorithms can be found in Lin and Gottschalk (1998) and Jiménez et al. (2001), and a taxonomy of collision detection approaches can be found in Borro et al. (2005).

Once implemented, collision detection prevents part interpenetration. However, collision detection does not provide feedback to the user regarding how to change the position and orientation of parts to align them for completing the assembly operation (Fröhlich et al. 2000). Two main classifications of techniques for implementing part positioning during an assembly include physics-based modeling and constraint-based modeling. Physics-based modeling simulates realistic behavior of parts in a virtual scene. Parts are assembled together with the help of simulated physical interactions that are calculated in real time. The second technique utilizes geometric constraints similar to those used by modern CAD systems. In this approach, geometric constraints such as concentricity, coplanar surfaces, etc. are applied between parts, thus reducing the degrees of freedom and facilitating the assembly task at hand.

3.2 Inter-part constraint detection and management

Due to the problems related to physics-based modeling (instability, difficulty attaining interactive update rates, accuracy, etc.), several approaches using geometric constraints for virtual assembly have been proposed. Constraint-based modeling approaches use inter-part geometric constraints (typically predefined and imported, or defined on the fly) to determine relationships between components of an assembly. Once constraints are defined and applied, a constraint solver computes the new and reduced degrees of freedom of objects and the objects' resulting motion.

A vast amount of research focused on solving systems of geometric constraints exists in the literature. Numerical constraint solver approaches translate constraints into a system of algebraic equations. These equations are then solved using iterative methods such as Newton–Raphson (Light and Gossard 1982). Good initial values are required to handle the potentially exponential number of possible solutions. Although solvers using this method are capable of handling large non-linear systems, most of them have difficulties handling over-constrained and under-constrained instances (Fudos 1995) and are computationally expensive, which makes them unsuitable for interactive applications such as virtual assembly (Fernando et al. 1999). Constructive constraint approaches are based on the fact that, in principle, most configurations of engineering drawings can be solved on a drawing board using standard drafting techniques (Owen 1991). In the rule-constructive method, "solvers use rewrite rules for discovery and execution of construction steps". Although complex constraints are easy to handle, the exhaustive computation requirements (searching and matching) of these methods make them inappropriate for real world applications (Fudos and Hoffman 1997). Examples of this approach are described in Verroust et al. (1992), Sunde (1988) and Suzuki et al. (1990). Graph-constructive approaches are based on analysis of the constraint graph. Based on the analysis, a set of constructive steps is generated. These steps are then followed to place the parts relative to each other. Graph-constructive approaches are fast, methodical and provide means for developing robust algorithms (Owen 1991; Fudos and Hoffman 1997; Bouma et al. 1995; Fudos and Hoffman 1995). An extensive review and classification of various constraint solving techniques is presented in Fudos (1995).
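As a concrete illustration of the numerical approach, the hedged sketch below expresses geometric constraints as residual equations C(q) = 0 in the coordinates q and drives them to zero with Newton–Raphson iterations using a finite-difference Jacobian; the two-circle example is hypothetical and far simpler than the constraint sets the solvers cited above must handle.

```python
# Newton-Raphson over constraint residuals C(q) = 0, with a finite-difference Jacobian.
import numpy as np


def solve_constraints(residual, q0, tol=1e-9, max_iter=50):
    """residual: maps coordinates q -> constraint residual vector C(q).
    q0: initial guess (a good guess matters, as noted above)."""
    q = np.asarray(q0, dtype=float)
    for _ in range(max_iter):
        c = np.asarray(residual(q), dtype=float)
        if np.linalg.norm(c) < tol:
            return q
        eps = 1e-6
        J = np.zeros((c.size, q.size))           # finite-difference Jacobian dC/dq
        for j in range(q.size):
            dq = np.zeros_like(q)
            dq[j] = eps
            J[:, j] = (np.asarray(residual(q + dq), dtype=float) - c) / eps
        # Least-squares step copes with over/under-constrained cases
        q = q - np.linalg.lstsq(J, c, rcond=None)[0]
    raise RuntimeError("constraint solver did not converge")


# Hypothetical example: place a point on two circles at once (two distance constraints).
def two_circles(q):
    x, y = q
    return [x**2 + y**2 - 1.0,                   # on the unit circle at the origin
            (x - 1.0)**2 + y**2 - 1.0]           # on the unit circle centred at (1, 0)


print(solve_constraints(two_circles, q0=[0.4, 0.9]))   # converges to ~(0.5, 0.866)
```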

3.3 Physics-based modeling

The physics-based modeling approach relies on simulating physical constraints for assembling parts in a virtual scene. Physical modeling can significantly enhance the user's sense of immersion and interactivity, especially in applications requiring intensive levels of manipulation (Burdea 1999). Physics-based algorithms simulate forces acting on bodies in order to model realistic behavior. Such algorithms solve the equations of motion of the objects at each time step, based on their physical properties and the forces and torques that act upon them.

Physics-based modeling algorithms can be classified into three categories based on the method used, namely the penalty force method, the impulse method, and the analytical method. In the penalty force method, a spring-damper system is used to prevent interpenetration between models. Whenever a penetration occurs, a spring-damper system is used to penalize it (McNeely et al. 1999; Erleben et al. 2005). Penalty-based methods are easy to implement and computationally inexpensive; however, they are characterized by problems caused by very high spring stiffness, leading to stiff equations which are numerically intractable (Witkin et al. 1990). The impulse-based methods (Hahn 1988; Mirtich and Canny 1995; Guendelman et al. 2003) simulate interactions among objects using collision impulses. Static contacts in this approach are modeled as a series of high-frequency collision impulses occurring between the objects. The impulse-based methods are more stable and robust than penalty force methods. However, these methods have problems handling stable and simultaneous contacts (such as a stack of blocks at rest) and also in modeling static friction in certain cases such as sliding (Mirtich 1996). The analytical method (Baraff 1989, 1997) checks for interpenetrations. If one is found, the algorithm backtracks the simulation to the point in time immediately before the interpenetration. Based on the contact points, a system of constraint equations is solved to generate contact forces and impulses at every contact point (Baraff 1990). The results from this method are very accurate; however, it requires extensive computation time when several contacts occur simultaneously.
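The penalty force method can be illustrated with the minimal sketch below, in which a sphere drops onto a floor plane; the stiffness and damping values are assumed for illustration and the code is not taken from any cited system. The high stiffness needed to keep penetration small is exactly what makes the resulting equations stiff and forces a small time step.

```python
# Penalty-method contact sketch: interpenetration depth d along the contact normal n
# produces a repulsive spring-damper force k*d - c*v_n (never attractive).
import numpy as np


def penalty_contact_force(depth, rel_normal_velocity, k=5.0e4, c=50.0):
    """Scalar contact force along the contact normal; zero when separated."""
    if depth <= 0.0:
        return 0.0
    return max(0.0, k * depth - c * rel_normal_velocity)


def step_sphere_on_floor(pos, vel, dt=1e-3, mass=1.0, radius=0.05):
    """One semi-implicit Euler step of a sphere dropping onto the plane y = 0."""
    gravity = np.array([0.0, -9.81, 0.0])
    normal = np.array([0.0, 1.0, 0.0])
    depth = radius - pos[1]                      # penetration of the sphere into the floor
    f = penalty_contact_force(depth, vel @ normal) * normal + mass * gravity
    vel = vel + dt * f / mass                    # integrate velocity first...
    pos = pos + dt * vel                         # ...then position (semi-implicit Euler)
    return pos, vel


pos, vel = np.array([0.0, 0.5, 0.0]), np.zeros(3)
for _ in range(2000):                            # simulate 2 s; the sphere ends up resting
    pos, vel = step_sphere_on_floor(pos, vel)    # close to y = radius
print(pos)
```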

Thus, although various algorithms for physics-based modeling have evolved over the years, simulating realistic behavior among complex parts interactively and accurately is still a challenging task.

4 Review of virtual assembly applications

Progress in constraint modeling and physics-based modeling has supported substantial research activity in the area of virtual assembly simulations. In this paper, we categorize these assembly applications as either constraint-based or physics-based systems.

4.1 Constraint-based assembly applications

The?rst category consists of systems that use constraints to place parts in their?nal position and orientation in the assembly.Constraints in the context of this research are of two types.The?rst are positional constraints,which are pre-de?ned?nal part positions.The second are geometric constraints that relate part features and are applied when related objects are in proximity.Geometric constraints are useful in precise part positioning tasks in a virtual envi-ronment where physical constraints are absent(Wang et al. 2003;Marcelino et al.2003).Constraint-based methods summarized in Sect.3.2are used to solve for relative object movements.

4.1.1 Systems using positional constraints

IVY (Inventor Virtual Assembly), developed by Kuehne and Oliver (1995), used the IRIS Open Inventor graphics library from Silicon Graphics and allowed designers to interactively verify and evaluate the assembly characteristics of components directly from a CAD package. The goal of IVY was to encourage designers to evaluate assembly considerations during the design process to enable design-for-assembly (DFA). Once the assembly was completed, the application rendered a final animation of the assembly steps in a desktop environment.

The high cost of VR systems encouraged researchers to explore the use of personal computers (PCs) for VR-based assembly simulations. A PC-based system, "VShop" (Fig. 2), was developed by Pere et al. (1996) for mechanical assembly training in virtual environments. The research focused on exploring PC-based systems as a low-cost alternative and utilizing commercial libraries for easy creation of interactive VR software. The system implemented bounding-box collision detection to prevent model interpenetration. The system provided grasping force feedback to the user and recognized gestures using a Rutgers Master II haptic exoskeleton. Hand gesture recognition was used for various tasks such as switching navigation on and off and moving forward/backward in the environment.

Fig. 2 VShop user interface

An experimental study investigating the potential benefits of VR environments in supporting assembly planning was conducted by Ye et al. (1999). For virtual assembly planning, a non-immersive desktop and an immersive CAVE (Cruz-Neira et al. 1992, 1993) environment were evaluated. The desktop VR environment consisted of a Silicon Graphics workstation. The CAVE environment was implemented with an IRIS Performer CAVE interface and provided the subjects with a more immersive sense of virtual assemblies and parts. The experiment compared assembly operations in a traditional engineering environment and in immersive and non-immersive VR environments. The three conditions differed in how the assembly was presented and handled. The assembly task was to generate an assembly sequence for an air-cylinder assembly (Fig. 3) consisting of 34 parts. The results from the human subject study concluded that the subjects performed better in VEs than in traditional engineering environments in tasks related to assembly planning.

Anthropometric data was utilized to construct virtual human models for addressing ergonomic issues that arise during assembly (Bullinger et al. 2000). A head mounted display (HMD) was used for stereo viewing, and a data glove was used for gesture recognition. Head and hand tracking was implemented using magnetic trackers. While performing assembly tasks, the users could see their human model in the virtual environment. The system calculated the time and cost involved in assembly and also produced a script file describing the sequence of actions performed by the user to assemble the product.

An industrial study was performed at BMW to verify assembly and maintenance processes using virtual prototypes (Gomes de Sa et al. 1999). A CyberTouch glove device was used for gesture recognition, part manipulation and for providing tactile force feedback to the user. A proximity snapping technique was used for part placement, and the system used voice input and provided acoustic feedback to convey information about the material properties of the colliding object. Gestures from the glove device were also used for navigating the virtual environment. Five different groups with diverse backgrounds participated in the user study. It was concluded that force feedback is crucial when performing virtual assembly tasks.

4.1.2 Systems using geometric constraints

One of the early attempts at utilizing geometric constraints to achieve accurate 3D positioning of solid models was demonstrated by Fa et al. in 1993. The concept of allowable motion was proposed to constrain the free 3D manipulation of the solid model. Simple constraints such as against, coincident, etc. were automatically recognized, and the system computed relative motion of objects based on the available constraints.

VADE (Virtual Assembly Design Environment; Jayaram et al. 1997, 1999, 2000a, b; Taylor et al. 2000), developed in collaboration with NIST and Washington State University, utilized constraint-based modeling (Wang et al. 2003) for assembly simulations. The system used Pro/Toolkit to import assembly data (transformation matrices, geometric constraints, assembly hierarchy, etc.) from CAD to perform assembly operations in the virtual environment. Users could perform dual handed assembly and dexterous manipulation of objects (Fig. 4). A CyberGrasp haptic device was used for tactile feedback during grasping. A physics-based algorithm with limited capabilities was later added to VADE for simulating realistic part behavior (Wang et al. 2001). A hybrid approach was introduced in which object motion is guided by both physical and geometric constraints simultaneously. Stereo vision was provided by an HMD or an ImmersaDesk (Czernuszenko et al. 1997) display. Commercial software tools were added to the system to perform ergonomic evaluation during assembly (Shaikh et al. 2004; Jayaram et al. 2006a). The VADE system was used to conduct industry case studies and demonstrate the downstream value of virtual assembly simulations in various applications such as ergonomics, assembly installation, process planning, installation, and serviceability (Jayaram et al. 2007).

Fig. 3 Presentation of the air-cylinder assembly in Ye's application

Different realistic hand grasping patterns involving complex CAD models have been explored by Wan et al. (2004a) and Zhu et al. (2004) using a multimodal system called MIVAS (A Multi-Modal Immersive Virtual Assembly System). They created a detailed geometric model of the hand using metaball modeling (Jin et al. 2000; Guy and Wyvill 1995) and tessellated it to create a graphic representation, which was texture-mapped with images captured from a real human hand (Wan et al. 2004b). A three-layer model (skeletal layer, muscle layer and skin layer) was adapted to simulate deformation in the virtual hand using simple kinematics models. Hand-to-part collision detection and force computations were performed using the fast but less accurate VPS software (McNeely et al. 1999), while part-to-part collision detection was implemented using the RAPID (Gottschalk et al. 1996) algorithm. Geometric constraints were utilized in combination with collision detection to calculate allowable part motion and accurate part placement. Users could feel the size and shape of digital CAD models via the CyberGrasp haptic device from Immersion Corporation (http://www.immersion.com/).

Commercial constraint solvers such as D-Cubed (http://www.plm.automation.siemens.com/en_us/products/open/d-cubed/index.shtml) have also been utilized for simulating kinematic behavior in constraint-based assembly simulations. Marcelino et al. (2003) developed a constraint manager for performing maintainability assessments using virtual prototypes. Instead of importing geometric constraints from CAD systems using proprietary toolkits, a constraint recognition algorithm was developed which examined part geometries (surfaces, edges, etc.) within a certain proximity to predict possible assembly constraints. A geometric constraint approach was utilized to achieve real time system performance in a realistic kinematic simulation. The system (Fig. 5) imported B-Rep CAD data using the Parasolid (http://www.plm.automation.siemens.com/en_us/products/open/parasolid/index.shtml) geometry format. A constraint manager was developed which was capable of validating existing constraints, determining broken constraints and enforcing existing constraints in the system. The constraint recognition algorithm required extensive model preprocessing steps in which bounding boxes were added to all surfaces of the objects before they could be imported.

The concept of assembly ports (Jung et al. 1998; Singh and Bettig 2004), in combination with geometric constraints, has been used by researchers for assembly and tolerance analysis. Liu and Tan (2005) created a system which used assembly ports containing information about the mating part surfaces, for example geometric and tolerance information, assembly direction and type of port (hole, pin, key, etc.). If parts were modified by a design team, the system used assembly port information to analyze whether new designs could be re-assembled successfully. Different rules were created (proximity, orientation, port type and parameter matching) for applying constraints among parts. Gesture recognition was implemented using a CyberGlove device. A user study was conducted which confirmed that constraint-based modeling was beneficial for users when performing precise assembly positioning tasks (Liu and Tan 2007).

Fig. 4 VADE usage scenarios

Fig. 5 Marcelino's constraint manager interface

Attempts have also been made at integrating CAD systems with virtual assembly and maintenance simulations (Jayaram et al. 1999, 2006b). A CAD-linked virtual assembly environment was developed by Wang et al. (2006), which utilized constraint-based modeling for assembly. The desktop-based system ran as a standalone process and maintained communication with Autodesk Inventor® CAD software. Low level-of-detail (LOD) proxy representations of CAD models were used for visualization in the virtual environment. The assembly system required persistent communication with the CAD system using proprietary APIs for accessing information such as assembly structure, constraints, B-Rep geometry and object properties. The concept of a proxy entity was proposed, which allowed the system to map related CAD entities (surfaces, edges, etc.) to their corresponding triangle mesh representations present in VR.

Yang et al. (2007) used constraint-based modeling for assembly path planning and analysis. Assembly tree data, geometric data of parts and predefined geometric constraints could be imported from different parametric CAD systems using a special data converter. A data glove device and a hand tracker were used for free manipulation of objects in the virtual environment. The automatic constraint recognition algorithm activated the predefined constraints when the bounding boxes of the interrelated parts collided. The users were required to confirm a constraint before it could be applied. These capabilities were applied to the integrated virtual assembly environment (IVAE) system.

4.2 Physics-based modeling applications

The second category of applications includes assembly systems that simulate real world physical properties, friction, and contact forces to assemble parts in a virtual environment. These applications allow users to move parts freely in the environment. When a collision is detected, physics-based modeling algorithms are used to calculate subsequent part trajectories to allow for realistic simulation.

Assembly operators working on the shop floor rely on physical constraints among mating part surfaces for completing assembly tasks. In addition, physical constraint simulation is important during assembly planning as well as maintenance assessments, to check whether there is enough room for parts and tooling. One of the early attempts at implementing physics-based modeling for simulating part behavior was made by Gupta (Gupta and Zeltzer 1995; Gupta et al. 1997). The desktop-based system, called VEDA (Virtual Environment for Design for Assembly), used a dual Phantom® interface for interaction and provided haptic, auditory and stereo cues to the user for part interaction. However, the system was limited to rendering multimodal interactions among only 4–5 polygons and handled only 2D models in order to maintain an interactive update rate.

Collision detection and physical constraint simulation among complex 3D models was attempted by Fröhlich et al. (2000). They used the CORIOLIS™ (Baraff 1995) physics-based simulation algorithm to develop an interactive virtual assembly environment using the Responsive Workbench (Krüger and Fröhlich 1994). Different configurations of spring-based virtual tools were developed to interact with objects. The system implemented the workbench in its table-top configuration and supported multiple tracked hands and users to manipulate an object. The system's update rates dropped below interactive levels when several hundred collisions occurred simultaneously, and at least five percent tolerance was necessary to avoid numerical instabilities which sometimes resulted in system failure.

Researchers at the Georgia Institute of Technology utilized an approach similar to that demonstrated by Gupta et al. (1997) to create a desktop-based virtual assembly system called HIDRA (Haptic Integrated Dis/Re-assembly Analysis; Coutee et al. 2001; Coutee and Bras 2002). This approach used GHOST (General Haptic Open Software Toolkit) from SensAble Technologies (http://www.sensable.com/) and a dual Phantom® configuration for part grasping. OpenGL was used for visualization on a 2D monitor, and V-Clip in conjunction with Q-hull and SWIFT++ was used for collision detection. Because the system (Fig. 6) treated the user's finger tip as a point rather than a surface, users had difficulty manipulating complicated geometries. Also, using the GHOST SDK for physical modeling combined with the "polygon soup" based collision detection of SWIFT++, HIDRA had problems handling non-convex CAD geometry.

Fig. 6 Geometry in HIDRA

Researchers (Kim and Vance 2003, 2004a) evaluated several collision detection and physics-based algorithms and found the VPS (McNeely et al. 1999) software from The Boeing Company to be the most applicable for handling the rigorous real time requirements while operating on complex 3D CAD geometry. The system utilized approximated triangulated representations of complex CAD models to generate a volumetric representation that was used for collision computations. Four- and six-sided CAVE systems were supported, and a virtual arm model was constructed by using multiple position trackers placed on the user's wrist, forearm and upper arm (Fig. 7). Dual handed assembly was supported, and gesture recognition was done using wireless data glove devices from 5DT Corporation (http://www.5dt.com/).

Techniques developed during this research were expanded to facilitate collaborative assembly (Kim and Vance 2004b) through the internet. A combination of peer-to-peer and client–server network architectures was developed to maintain the stability and consistency of the system data. A "Release-but-not-released (RNR)" method was developed to allow computers with different performance capabilities to participate in the network. The system architecture required each virtual environment to be connected to a local PC to ensure a 1 kHz haptic update rate for smooth haptic interaction. Volumetric approximation of complex CAD models resulted in a fast but inaccurate simulation (with errors up to ~15 mm) and thus did not allow low-clearance parts to be assembled. A dual-handed haptic interface (Fig. 8) for assembly/disassembly was created by Seth et al. (2005, 2006). This interface was integrated into SHARP: System for Haptic Assembly and Realistic Prototyping, and allowed users to simultaneously manipulate and orient CAD models to simulate dual-handed assembly operations. Collision force feedback was provided to the user during assembly. Graphics rendering was implemented with SGI Performer, the OpenHaptics Toolkit library was used for communicating with the haptic devices, and VPS (McNeely et al. 1999) was used for collision detection and physics-based modeling. Using VRJuggler (Just et al. 1998) as an application

platform, the system could operate on different VR system configurations including low-cost desktop configurations, the Barco Baron (http://www.barco.com/entertainment/en/products/product.asp?element=1192), Power Wall, and four-sided and six-sided CAVE systems. Different modules were created to address issues related to maintenance (swept volumes) and training (record and play) and to facilitate collaboration (networked communication). Industrial applications of this work demonstrated promising results for simulating assembly of complex CAD models from a tractor hitch. This research was later expanded to gain collision detection accuracy at the cost of computation speed for simulating low-clearance assembly. SHARP demonstrated a new approach (Seth et al. 2007) by simulating physical constraints using accurate B-Rep data from CAD systems, which allowed the system to detect collisions with an accuracy of 0.0001 mm. Although physical constraints were simulated very accurately, users could not manipulate parts during very low-clearance scenarios with the required precision because of the noise associated with the 3D input devices. Geometric constraints were therefore utilized in combination with physics to achieve the precise part manipulation required for low-clearance assembly.

Fig. 7 Data glove in a six-sided CAVE

Fig. 8 Dual-handed haptic interface in SHARP

Garbaya and Zaldivar-Colado (2007) created a physics-based virtual assembly system which used a spring-damper model to provide the user with collision and contact forces during the mating phase of an assembly operation. The PhysX® software toolkit was used for collision detection and physically based modeling. Grasping force feedback was provided using a CyberGrasp™ haptic device, and collision force was provided using a CyberForce™ haptic device from Immersion Corporation. An experimental study was conducted to check system effectiveness and user performance in real and virtual environments. The study concluded that user performance increased when inter-part collision forces were rendered to the user, compared with when only grasping forces were provided.

HAMMS (Haptic Assembly, Manufacturing and Machining System) was developed by researchers at Heriot-Watt University to explore the use of immersive technology and haptics in assembly planning (Ritchie et al. 2008). The system uses a Phantom® device and stereo glasses. The application is based on the OpenHaptics Toolkit, VTK and AGEIA PhysX® software. The unique aspect of this application is its ability to log user interaction. This tracking data can be recorded and examined later to generate an assembly procedure. This work is ongoing, with future evaluations to be performed.

5 Haptic interaction

Today's virtual assembly environments are capable of simulating visual realism to a very high level. The next big challenge for the virtual prototyping community is simulating realistic interaction. Haptics is an evolving technology that offers a revolutionary approach to realistic interaction in VEs. "Haptics means both force feedback (simulating object hardness, weight, and inertia) and tactile feedback (simulating surface contact geometry, smoothness, slippage and temperature)" (Burdea 1999). Force cues provided by haptics technology can help designers feel and better understand virtual objects by supplementing visual and auditory cues and creating an improved sense of presence in the virtual environment (Coutee and Bras 2004; Lim et al. 2007a, b). Research has shown that the addition of haptics to virtual environments can result in improved task efficiency (Burdea 1999; Volkov and Vance 2001).

Highly efficient physics-based methods that are capable of maintaining high update rates are generally used for implementing haptic feedback in virtual assembly simulations. Various approaches for providing haptic feedback for assembly have been presented in the past, focusing on new methods for providing tactile (Pere et al. 1996; Jayaram et al. 1999, 2006b; Wan et al. 2004a; Regnbrecht et al. 2005), collision (Kim and Vance 2004b; Seth et al. 2005, 2006) and gravitational force feedback (Coutee and Bras 2004; Gurocak et al. 2002). The high update rate (~1 kHz) requirement for effective haptics has always been a challenge in integrating this technology. As stated earlier, most physics-based algorithms use highly coarse model representations to maintain the required update rates. The resulting lack of part shape accuracy of such approaches presents problems when detailed contact information is necessary. Simulating complex part interactions such as grasping is also demanding, as it requires the simulation to detect collisions and generate contact forces accurately for each individual finger (Wan et al. 2004a; Zhu et al. 2004; Jayaram et al. 2006b; Zachmann and Rettig 2001). Maintaining update rates for haptic interaction (~1 kHz) while performing highly accurate collision/physics computations in complex interactive simulations such as assembly remains a challenge for the community.
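The usual response to this rate mismatch is to decouple the haptic and graphics loops. The sketch below shows one possible structure, with hypothetical callbacks standing in for device and rendering code; it is an assumption for illustration, not the architecture of any cited system. A haptic thread computes forces against a coarse proxy model near 1 kHz while the graphics loop redraws the detailed model at roughly 30 Hz.

```python
# Decoupled update loops: ~1 kHz haptics on a coarse proxy, ~30 Hz graphics on the full model.
import threading
import time

shared = {"force": (0.0, 0.0, 0.0), "running": True}
lock = threading.Lock()


def haptic_loop(compute_force, device_pose, rate_hz=1000):
    period = 1.0 / rate_hz
    while shared["running"]:
        start = time.perf_counter()
        f = compute_force(device_pose())         # fast, coarse collision response
        with lock:
            shared["force"] = f
        time.sleep(max(0.0, period - (time.perf_counter() - start)))


def graphics_loop(render, rate_hz=30):
    period = 1.0 / rate_hz
    while shared["running"]:
        start = time.perf_counter()
        with lock:
            f = shared["force"]
        render(f)                                # slower, high-fidelity drawing
        time.sleep(max(0.0, period - (time.perf_counter() - start)))


# Hypothetical placeholder callbacks; a real system would call the device SDK and renderer here.
threading.Thread(target=haptic_loop,
                 args=(lambda pose: (0.0, 0.0, 0.0), lambda: (0.0, 0.0, 0.0)),
                 daemon=True).start()
```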

In addition, there are several limitations of the haptics technology currently available. Non-portable haptic devices such as SensAble Technologies' PHANToM® (http://www.sensable.com/; Massie and Salisbury 1994), Immersion's CyberForce™ (http://www.immersion.com/), the Haption Virtuose (http://www.haption.com/index.php?lang=eng), and the Novint Falcon (http://www.novintfalcon.com/) devices (Yang et al. 2007), among others (Millman et al. 1993; Buttolo and Hannaford 1995), have workspace limitations which result in restricted user motion in the environment. Additionally, because these devices need to be stably mounted, their use with immersive virtual environments becomes unfeasible.

In contrast, wearable haptic gloves and exoskeleton devices such as the CyberTouch™ and CyberGrasp™ (http://www.immersion.com/) and the Rutgers Master II (Bouzit et al. 2002), among others (Gurocak et al. 2002), provide a much larger workspace for interaction. However, they provide force feedback only to the fingers and palm and thus are suitable only for tasks that involve dexterous manipulation. In addition, the weight and cable attachments of such devices make their use unwieldy. A detailed discussion of haptics issues can be found in Burdea (2000). The challenges presented here, among several others, must be addressed before the community can explore the real potential of haptics technology in virtual prototyping.

6 CAD-VR data exchange

CAD-VR data exchange is one of the most important issues faced by the virtual prototyping community. CAD systems used by industry to develop product models are generally unsuitable for producing optimal representations for VR applications. Most VR applications take advantage of scene graphs (e.g., OpenSceneGraph, OpenSG, OpenGL Performer, etc.) for visualization, which are simply hierarchical data structures comprised of triangulated mesh geometry, spatial transforms, lighting, material properties, and other metadata. Scene graph renderers provide the VR application with methods to exploit this data structure to ensure interactive frame rates. Translating CAD data into a scene graph requires tessellation of the individual precise parametric surfaces and/or B-Rep solids, often multiple times, to produce several "level-of-detail" polygonal representations of each part. During this translation process, the parametric (procedural modeling history and constraints) information of the CAD model generally does not get imported into the VR application. In addition, pre-existing texture maps may not be included in these visually optimized model representations. In virtual assembly simulations, geometric constraint-based applications that depend on parametric model definitions to define inter-part constraint relationships generally have to deal with two representations of the same model: one for visualization and another for the constraint modeling algorithms that perform assembly. Similarly, physics modeling applications also use dual model representations: a high-fidelity model for visualization and a coarser representation used for interactive physics calculations (Fröhlich et al. 2000; Seth et al. 2005).
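The scene-graph structure described above can be sketched minimally as follows; this is illustrative only, and real scene graphs such as OpenSceneGraph are far richer. Each node carries a local transform, optional triangle meshes at several levels of detail, and child nodes, and the renderer traverses the hierarchy accumulating world transforms.

```python
# Minimal scene-graph node: local transform, per-LOD triangle meshes, children.
from dataclasses import dataclass, field
from typing import Dict, List
import numpy as np


@dataclass
class TriMesh:
    vertices: np.ndarray     # (n, 3) float
    triangles: np.ndarray    # (m, 3) int indices into vertices


@dataclass
class SceneNode:
    name: str
    local_transform: np.ndarray = field(default_factory=lambda: np.eye(4))
    lod_meshes: Dict[int, TriMesh] = field(default_factory=dict)   # LOD level -> mesh
    children: List["SceneNode"] = field(default_factory=list)


def world_transforms(node, parent=None, out=None):
    """Accumulate 4x4 transforms down the hierarchy (what a renderer traverses)."""
    parent = np.eye(4) if parent is None else parent
    out = {} if out is None else out
    world = parent @ node.local_transform
    out[node.name] = world
    for child in node.children:
        world_transforms(child, world, out)
    return out
```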

Commercial CAD systems (for example AutoCAD, UGS, Dassault Systèmes, etc.) have made various attempts to embed capabilities for immersive and desktop stereo visualization into available commercial software to some degree. Attempts have also been made by academia to provide haptic interaction and immersive visualization for assembly/disassembly applications with commercial CAD systems (Jayaram et al. 2006b; Wang et al. 2006). Thus, although addressed to some degree by industry and academia, there is still no general non-proprietary way to convert CAD assemblies into a representation suitable for VR. Additionally, today's VR applications have matured to a level where they provide users with the ability to identify meaningful design changes; however, translating these changes back to CAE applications (such as CAD systems) is currently not possible. The efforts mentioned earlier represent a promising basis for this research, but as yet it remains a major bottleneck to broader adoption of VR.

7 Summary

Many virtual assembly applications have been developed by various research groups, each with different features and capabilities. The review in the previous section indicated that initial efforts in simulating assembly used predefined transformation matrices of parts for positioning in the virtual scene. In such systems, as users moved parts in the environment, the parts were snapped into place based on collision or proximity criteria (Dewar et al. 1997b; Fig. 9). Most of the early applications did not implement collision detection among objects, which allowed them to interpenetrate during the simulation.

Later, researchers used predefined geometric-constraint relationships which were imported from a CAD system for assembling parts. Here, the predefined constraints were activated when related parts came close to each other in the environment. Once geometric constraints were recognized, constrained motion could be visualized between parts, which were then assembled using predefined final positions (Jayaram et al. 1999). Constraint-based approaches have shown promising results in the past. They present lower computation and memory requirements when compared to physics-based methods. In addition, when combined with accurate models (e.g., parametric surface representations or B-Rep solids), constraint-based approaches allow users to manipulate and position parts in an assembly with very high fidelity. However, some of these applications required special CAD toolkits to extract the relevant CAD metadata (Fig. 10) required for preparing an assembly scenario (Jayaram et al. 1999; Wan et al. 2004a; Chen et al. 2005). These special data requirements and their dependence on specific CAD systems prevented widespread acceptance of these applications. Many constraint-based virtual assembly systems also incorporated collision detection between models to prevent model interpenetration during assembly. Advanced constraint-based methods were successful in identifying, validating and applying constraints on-the-fly and thus did not require importing predefined CAD constraints (Marcelino et al. 2003; Zhang et al. 2005). Although systems using constraint-based modeling have proven successful in simulating an object's kinematics for assembly, simulating realistic behavior among objects involving physical constraints and rigid body dynamics is not possible with these methods.

Fig. 9 Data transfer in positional constraint applications

Other research incorporated simulation of the real world physical behavior of parts (Fig. 11). Physics-based methods allow for testing scenarios similar to those possible only with physical mock-ups, by calculating part trajectories subsequent to collisions and possibly incorporating friction, gravity, and other forces that act on the objects. Physics-based solvers generally sacrifice computation accuracy to keep the update rate of the visual simulation realistic (Jiménez et al. 2001). Most previous efforts used simplified and approximated polygon mesh representations of CAD models for faster collision and physics calculations. Some of these efforts generated even coarser representations by using cubic voxel elements for physics and collision calculations (McNeely et al. 1999; Garcia-Alonso et al. 1994; Kaufman et al. 1993). Assembly configurations like a tight peg in a hole caused several hundreds of collisions to occur, which often resulted in numerical instabilities in the system (Fröhlich et al. 2000). Due to these limitations, very few attempts rely on simulating physical constraints for assembly/disassembly simulations.
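The coarse voxel approximation mentioned above can be sketched as follows; this illustrates the general idea only and is not VPS or any cited implementation. Surface sample points of a part are rasterized into a boolean occupancy grid, and a collision query between two parts then reduces to testing whether any cell is occupied by both.

```python
# Voxel occupancy approximation: rasterise surface samples into a boolean grid per part.
import numpy as np


def voxelize_points(points, origin, cell_size, grid_shape):
    """Mark every grid cell containing at least one surface sample point (points: (n, 3))."""
    grid = np.zeros(grid_shape, dtype=bool)
    idx = np.floor((np.asarray(points) - np.asarray(origin)) / cell_size).astype(int)
    inside = np.all((idx >= 0) & (idx < np.array(grid_shape)), axis=1)
    grid[tuple(idx[inside].T)] = True
    return grid


def voxel_collision(grid_a, grid_b):
    """Two voxelised parts (in the same grid frame) collide if any cell is shared."""
    return bool(np.any(grid_a & grid_b))
```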

In addition, physics-based methods also lay the foundation for the implementation of haptic interfaces for virtual prototyping applications. Such haptic interfaces allow users to touch and feel the virtual models that are present in the simulation. Haptic interfaces require much higher update rates of ~1 kHz, which results in trade-offs in the accuracy of collision and physics computations. In order to complete assembly tasks with tight tolerances, nominal part size modification may be required (Baraff 1995; Seth et al. 2006). However, because assembly operations require mating with small clearances, it is generally not possible to assemble low-clearance parts with actual dimensions using physics-based methods. The demand for highly accurate physics/collision results while maintaining simulation interactivity is still a challenge for the community. In prototyping applications like virtual assembly, attempts have been made to provide collision and tactile forces to the users for more intuitive interaction with the environment (Zhu et al. 2004; Coutee et al. 2001; Kim and Vance 2004b; Seth et al. 2005, 2006).

8 Discussion and future directions

Collision detection algorithms unquestionably form the first step toward building a virtual assembly simulation system. Although collision detection adds to simulation realism by preventing part interpenetration, it alone does not model part behavior or define the relative part trajectories necessary to facilitate the assembly operation. Part interaction methods are key to a successful immersive virtual assembly experience.

In general, constraint-based approaches provide capabilities for precise part positioning in VEs, while physics-based approaches enable virtual mock-ups to behave like their physical counterparts. Identifying physical constraints among an arbitrary set of complex CAD models in a dynamic virtual simulation is a computationally demanding challenge. Collision and physics responses need to be calculated fast enough to keep up with the graphics update rate (~30 Hz) of the simulation. Both of these approaches serve different purposes which are crucial in making a virtual assembly simulation successful. A research direction that appears promising would be to develop a hybrid method by combining physics-based and constraint-based algorithms. The resulting virtual assembly application would be able to simulate realistic environment behavior for an enhanced sense of presence and would also be able to position parts precisely in a given assembly (Table 1). An attempt has been made to add a physics-based algorithm with limited capabilities to an existing constraint-based assembly system (Wang et al. 2001). However, limitations of the physics algorithm, part snapping and excessive metadata requirements using a CAD-system-dependent toolkit prevented its widespread impact.

Fig. 10 Data transfer in geometric constraint-based applications

Fig. 11 Data transfer in physics-based applications

Such an approach would incorporate physics-based methods for simulating realistic part behavior, combined with automatic constraint identification, application and haptic interaction. Constraint-based methods would come into play when low-clearance assembly needs to be performed, to allow for precise movement of parts into their final position. The challenge in this approach is that the physics-based methods should be able to take into account the presence of a geometric constraint, and the "hybrid solver" should be able to calculate part trajectories in such a way that both physical and geometric constraints are satisfied at any given point in time.
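One possible structure for such a hybrid solver is sketched below, purely as an assumption about how the pieces could fit together rather than an existing implementation: an unconstrained physics step proposes a new pose, and each active geometric constraint then projects that pose back onto its constraint manifold, so that both physical and geometric constraints hold at the end of every step.

```python
# Hybrid step: unconstrained physics update, then projection onto active geometric constraints.
import numpy as np


def hybrid_step(pos, vel, force, mass, dt, active_constraints):
    """One update of a part's position: integrate physics, then enforce geometry."""
    vel = vel + dt * force / mass           # physics: semi-implicit Euler
    pos = pos + dt * vel
    for project in active_constraints:      # geometry: project the pose onto each constraint
        pos = project(pos)
    return pos, vel


# Example constraint: keep a pin on its hole axis (the line through p0 with direction d).
def on_axis(p0, d):
    p0 = np.asarray(p0, dtype=float)
    d = np.asarray(d, dtype=float)
    d = d / np.linalg.norm(d)
    def project(pos):
        return p0 + d * np.dot(pos - p0, d)   # closest point on the axis
    return project
```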

As the technology progresses, the cost of computing and visualization technology will continue to fall as their capabilities increase. It will soon be possible to utilize this power to integrate faster and more accurate algorithms into virtual assembly simulations that will be capable of handling large assemblies with thousands of parts while incorporating physically accurate part behavior with high-fidelity visual and haptic interfaces.

References

Baldwin DF, Abell TE, Lui M-CM, De Fazio TL, Whitney DE (1991) An integrated computer aid for generating and evaluating assembly sequences for mechanical products. IEEE Trans Rob Autom 7(1):78–94

Baraff D (1989) Analytical methods for dynamic simulation of non-penetrating rigid bodies. Computer Graphics 23(3):223–232
Baraff D (1990) Curved surfaces and coherence for non-penetrating rigid body simulation. Computer Graphics 24(4):19–28
Baraff D (1995) Interactive simulation of solid rigid bodies. IEEE Comput Graph Appl 15(3):63–75
Baraff D (1997) Physically based modeling: principles and practice (online SIGGRAPH 97 course notes), http://www.cs.cmu.edu/~baraff/sigcourse/
Boothroyd G, Dewhurst P (1989) Product design for assembly. McGraw-Hill, New York

Boothroyd G, Dewhurst P, Knight W (1994) Product design for manufacture and assembly. Marcel Dekker, New York

Borro D, Hernantes J, Garcia-Alonso A, Matey L (2005) Collision problem: characteristics for a taxonomy. In: Proceedings of the ninth international conference on information visualisation (IV'05)

Bouma W, Fudos I, Hoffman CM, Cai J, Paige R (1995) A geometric constraint solver. Comput Aided Design 27(6):487–501

Bouzit M, Popescu G, Burdea GC, Boian R (2002) The Rutgers Master II-ND force feedback glove. In: HAPTICS 2002: haptic interfaces for virtual environment and teleoperator systems. Orlando, FL
Brough JE, Schwartz M, Gupta SK, Anand DK, Kavetsky R, Petterson R (2007) Towards the development of a virtual environment-based training system for mechanical assembly operations. Virtual Real 11(4):189–206

Bryson S (1996) Virtual reality in scientific visualization. Commun ACM 39(5):62–71

Bullinger HJ, Richer M, Seidel KA (2000) Virtual assembly planning. Hum Factors Ergon Manuf 10(3):331–341

Burdea GC (1999a) Invited review: the synergy between virtual reality and robotics. IEEE Trans Rob Autom 15(3):400–410
Burdea GC (1999b) Invited review: the synergy between virtual reality and robotics. IEEE Trans Rob Autom 15(3):400–410
Burdea GC (2000) Haptics issues in virtual environments. In: Computer graphics international. Geneva, Switzerland
Buttolo P, Hannaford B (1995) Pen-based force display for precision manipulation in virtual environments. In: IEEE virtual reality annual international symposium. Research Triangle Park, NC, USA

Chen X,Xu N,Li Y(2005)A virtual environment for collaborative assembly.In:Second international conference on embedded software and systems(ICESS’05)

Chryssolouris G,Mavrikios D,Fragos D,Karabatsou V(2000)A virtual reality-based experimentation environment for the veri-?cation of human-related factors in assembly processes.Rob Comp Integr Manuf16:267–276

Coutee AS,Bras B(2002)Collision detection for virtual objects in a haptic assembly and disassembly simulation environment.In: ASME design engineering technical conferences and computers and information in engineering conference(DETC2002/CIE-34385).Montreal,Canada

Cohen JD et al (1995) I-COLLIDE: an interactive and exact collision detection system for large-scale environments. In: The 1995 ACM international 3D graphics conference, pp 189–196
Coutee AS, Bras B (2004) An experiment on weight sensation in real and virtual environments. In: ASME design engineering technical conferences and computers and information in engineering conference (DETC2004-57674). Salt Lake City, Utah, USA
Coutee AS, McDermott SD, Bras B (2001) A haptic assembly and disassembly simulation environment and associated computational load optimization techniques. ASME Trans J Comput Inf Sci Eng 1(2):113–122
Cruz-Neira C, Sandin DJ, DeFanti TA, Kenyon R, Hart JC (1992) The CAVE: audio visual experience automatic virtual environment. In: Communications of the ACM, pp 64–72
Cruz-Neira C, Sandin D, DeFanti T (1993) Surround-screen projection-based virtual reality: the design and implementation of the CAVE. In: Proceedings of SIGGRAPH, vol 93, pp 135–142
Czernuszenko M, Pape D, Sandin D, DeFanti T, Dawe GL, Brown MD (1997) ImmersaDesk and infinity wall projection-based virtual reality displays. Computer Graphics 31(2):46–49
De Fazio TL, Whitney DE (1987) Simplified generation of all mechanical assembly sequences. IEEE J Rob Autom 3(6):640–658
de-Mello LSH, Sanderson AC (1989) A correct and complete algorithm for the generation of mechanical assembly sequences. In: IEEE international conference on robotics and automation. Scottsdale, AZ, USA
Dewar RG, Carpenter ID, Ritchie JM, Simmons JEL (1997a) Assembly planning in a virtual environment. In: Proceedings of Portland international conference on management of engineering and technology (PICMET 97). IEEE Press, Portland, OR
Dewar RG, Carpenter ID, Ritchie JM, Simmons JEL (1997b) Assembly planning in a virtual environment. In: Proceedings of Portland international conference on management of engineering and technology (PICMET 97). IEEE Press, Portland, OR
Eddy J, Lewis KE (2002) Visualization of multidimensional design and optimization data using cloud visualization. In: ASME design engineering technical conferences and computers and information in engineering conference (DETC2002/DAC-34130). Montreal, Canada
Ehmann SA, Lin MC (2000) SWIFT: accelerated proximity queries between convex polyhedra by multi-level Voronoi marching. Technical report, Computer Science Department, University of North Carolina at Chapel Hill
Ehmann SA, Lin MC (2001) Accurate and fast proximity queries between polyhedra using surface decomposition. Eurograph Comput Graph Forum 20(3)
Erleben K, Sporring J, Henriksen K, Dohlmann H (2005) Physics-based animation, 1st edn. Charles River Media, Hingham, p 817
Fa M, Fernando T, Dew PM (1993) Direct 3D manipulation for constraint-based solid modelling. Comput Graph Forum 12(3):237–248
Fernando T, Murray N, Tan K, Wilmalaratne P (1999) Software architecture for a constraint-based virtual environment. In: Proceedings of the ACM symposium on virtual reality software and technology. London, UK
Fröhlich B, Tramberend H, Beers A, Agarawala M, Baraff D (2000) Physically-based modeling on the responsive workbench. In: IEEE virtual reality conference
Fudos I (1995) Constraint solving for computer aided design. In: Computer sciences, Purdue University, p 107
Fudos I, Hoffman CM (1995) Correctness proof of a geometric constraint solver. Int J Comput Geom Appl 6(4):405–420
Fudos I, Hoffman CM (1997) A graph constructive approach to solving systems of geometric constraints. ACM Trans Graph 16(2):179–216
Garbaya S, Zaldivar-Colado U (2007) The affect of contact force sensations on user performance in virtual assembly tasks. Virtual Real 11(4):287–299
Garcia-Alonso A, Serrano N, Flaquer J (1994) Solving the collision detection problem. IEEE Comput Graph Appl 14(3):36–43
Gomes de Sa A, Zachmann G (1999) Virtual reality as a tool for verification of assembly and maintenance processes. Comput Graph 23:189–403
Gottschalk S, Lin MC, Manocha D (1996) OBBTree: a hierarchical structure for rapid interference detection. In: ACM SIGGRAPH '96, pp 171–180
Govindaraju NK, Redon S, Lin MC, Manocha D (2003) CULLIDE: interactive collision detection between complex models in large environments using graphics hardware. In: Proceedings of ACM SIGGRAPH/Eurographics workshop on graphics hardware, pp 25–32
Guendelman E, Bridson R, Fedkiw RP (2003) Nonconvex rigid bodies with stacking. ACM Trans Graph 22(3):871–879
Gupta R, Zeltzer D (1995) Prototyping and design for assembly analysis using multimodal virtual environments. In: Proceedings of ASME computers in engineering conference and the engineering database symposium. Boston, MA
Gupta R, Whitney D, Zeltzer D (1997) Prototyping and design for assembly analysis using multimodal virtual environments. Comput Aided Des 29(8):585–597 (special issue on VR in CAD)
Gurocak H, Parrish B, Jayaram S, Jayaram U (2002) Design of a haptic device for weight sensation in virtual environments. In: ASME design engineering technical conferences and computers and information in engineering conference (DETC2002/CIE-34387). Montreal, Canada
Guy A, Wyvill B (1995) Controlled blending for implicit surfaces using a graph. In: Implicit surfaces. Grenoble, France
Hahn JK (1988) Realistic animation of rigid bodies. Computer Graphics 22(4):299–308
Holt PO, Ritchie JM, Day PN, Simmons JEL, Robinson G, Russell GT, Ng FM (2004) Immersive virtual reality in cable and pipe routing: design metaphors and cognitive ergonomics. ASME J Comput Inf Sci Eng 4(3):161–170
Hudson T et al (1997) V-COLLIDE: accelerated collision detection for VRML. In: Proceedings of the second symposium on virtual reality modeling language, pp 119–125
Jayaram S, Connacher HI, Lyons KW (1997) Virtual assembly using virtual reality techniques. Comput Aided Design 29(8):575–584
Jayaram S, Jayaram U, Wang Y, Tirumali H, Lyons K, Hart P (1999) VADE: a virtual assembly design environment. IEEE Comput Graph Appl 19(6):44–50
Jayaram U, Tirumali H, Jayaram S (2000a) A tool/part/human interaction model for assembly in virtual environments. In: ASME design engineering technical conferences 2000 (DETC2000/CIE-14584). Baltimore, MD
Jayaram S, Jayaram U, Wang Y, Lyons K (2000b) CORBA-based collaboration in a virtual assembly design environment. In: ASME design engineering technical conferences and computers and information in engineering conference (DETC2000/CIE-14585). Baltimore, MD
Jayaram U, Jayaram S, Shaikh I, Kim Y, Palmer C (2006a) Introducing quantitative analysis methods into virtual environments for real-time and continuous ergonomic evaluations. Comput Ind 57(3):283–296
Jayaram S, Joshi H, Jayaram U, Kim Y, Kate H, Varoz L (2006b) Embedding haptic-enabled tools in CAD for training applications. In: ASME design engineering technical conferences and computers and information in engineering conference (DETC2006-99656). Philadelphia, PA
Jayaram S, Jayaram U, Kim Y, DeChenne C, Lyons K, Palmer C, Mitsui T (2007) Industry case studies in the use of immersive virtual assembly. Virtual Real 11(4):217–228
Jiménez P, Thomas F, Torras C (2001) 3D collision detection: a survey. Comput Graph 25(2):269–285
Jin X, Li Y, Peng Q (2000) General constrained deformations based on generalized metaballs. Comput Graph 24(2)
Jung B, Latoschik M, Wachsmuth I (1998) Knowledge-based assembly simulation for virtual prototype modeling. In: Proceedings of the 24th annual conference of the IEEE industrial electronics society. Aachen, Germany
Just C, Bierbaum A, Baker A, Cruz-Neira C (1998) VR Juggler: a framework for virtual reality development. In: 2nd Immersive projection technology workshop (IPT98) CD-ROM. Ames, IA
Kaufman A, Cohen D, Yagle R (1993) Volume graphics. IEEE Comput 26(7):51–64
Kim CE, Vance JM (2003) Using VPS (Voxmap PointShell) as the basis for interaction in a virtual assembly environment. In: ASME design engineering technical conferences and computers and information in engineering conference (DETC2003/CIE-48297). ASME, Chicago, IL
Kim CE, Vance JM (2004a) Collision detection and part interaction modeling to facilitate immersive virtual assembly methods. ASME J Comput Inf Sci Eng 4(1):83–90
Kim CE, Vance JM (2004b) Development of a networked haptic environment in VR to facilitate collaborative design using Voxmap PointShell (VPS) software. In: ASME design engineering technical conferences and computers and information in engineering conference (DETC2004/CIE-57648). Salt Lake City, UT
Krüger W, Fröhlich B (1994) The responsive workbench. IEEE Comput Graph Appl 14(3):12–15
Kuehne R, Oliver J (1995) A virtual environment for interactive assembly planning and evaluation. In: Proceedings of ASME design automation conference. Boston, MA, USA
Light R, Gossard D (1982) Modification of geometric models through variational geometry. Comput Aided Design 14(4):209–214
Lim T, Ritchie JM, Corney JR, Dewar RG, Schmidt K, Bergsteiner K (2007a) Assessment of a haptic virtual assembly system that uses physics-based interactions. In: Proceedings of the 2007 IEEE international symposium on assembly and manufacturing. IEEE, Ann Arbor, MI
Lim T, Ritchie JM, Dewar RG, Corney JR, Wilkinson P, Calis M, Desmulliez M, Fang J-J (2007b) Factors affecting user performance in haptic assembly. Virtual Real 11(4):241–252
Lin M, Gottschalk S (1998) Collision detection between geometric models: a survey. In: Proceedings of IMA conference on mathematics of surfaces
Liu Z, Tan J (2005) Virtual assembly and tolerance analysis for collaborative design. In: 9th International conference on computer supported cooperative work in design 2005. Coventry, UK
Liu Z, Tan J (2007) Constraint behavior manipulation for interactive assembly in a virtual environment. Int J Adv Manuf Technol 32(7–8):797–810
Marcelino L, Murray N, Fernando T (2003) A constraint manager to support virtual maquettes. Comput Graph 27(1):19–26
Massie T, Salisbury K (1994) The PHANToM haptic interface: a device for probing virtual objects. In: Proceedings of the ASME winter annual meeting, symposium on haptic interfaces for virtual environment and teleoperator systems. Chicago, IL
McNeely WA, Puterbaugh KD, Troy JJ (1999) Six degree-of-freedom haptic rendering using voxel sampling. In: SIGGRAPH 99 conference proceedings, annual conference series, Los Angeles, CA
Millman PA, Stanley M, Colgate JE (1993) Design of a high performance haptic interface to virtual environments. In: IEEE virtual reality annual international symposium. Seattle, WA
Mirtich B (1996) Impulse-based dynamic simulation of rigid body systems. In: Computer science. University of California at Berkeley, p 246
Mirtich B (1998) V-Clip: fast and robust polyhedral collision detection. ACM Trans Graph 17(3):177–208
Mirtich B, Canny J (1995) Impulse-based simulation of rigid bodies. In: Symposium on interactive 3D graphics
Owen JC (1991) Algebraic solution for geometry from dimensional constraints. In: ACM symposium foundations of solid modeling. ACM, Austin, TX
Pere E, Langrana N, Gomez D, Burdea G (1996) Virtual mechanical assembly on a PC-based system. In: ASME design engineering technical conferences and computers and information in engineering conference (DETC1996/DFM-1306). Irvine, CA
Regnbrecht H, Hauber J, Schoenfelder R, Maegerlein A (2005) Virtual reality aided assembly with directional vibro-tactile feedback. In: Proceedings of the 3rd international conference on computer graphics and interactive techniques in Australasia and South East Asia. Dunedin, New Zealand
Ritchie JM, Simmons JEL, Carpenter ID, Dewar RG (1995) Using virtual reality for knowledge elicitation in a mechanical assembly planning environment. In: Proceedings of 12th conference of the Irish manufacturing committee
Ritchie JM, Dewar RG, Simmons JEL (1999) The generation and practical use of plans for manual assembly using immersive virtual reality. Proc I MECH E Part B J Eng 213(5):461–474
Ritchie JM, Lim T, Sung RS, Corney JR, Rea H (2008) The analysis of design and manufacturing tasks using haptic and immersive VR: some case studies. In: Product engineering. Springer, The Netherlands, pp 507–522
Schwartz M, Gupta SK, Anand DK, Kavetsky R (2007) Virtual mentor: a step towards proactive user monitoring and assistance during virtual environment-based training. In: Performance metrics for intelligent systems workshop (PerMIS 07). Washington, DC, USA
Seth A, Su HJ, Vance JM (2005) A desktop networked haptic VR interface for mechanical assembly. In: ASME international mechanical engineering congress & exposition (IMECE2005-81873). Orlando, FL, USA
Seth A, Su HJ, Vance JM (2006) SHARP: a system for haptic assembly & realistic prototyping. In: ASME design engineering technical conferences and computers and information in engineering conference (DETC2006/CIE-99476). Philadelphia, PA, USA
Seth A, Vance JM, Oliver JH (2007) Combining geometric constraints with physics modeling for virtual assembly using SHARP. In: ASME design engineering technical conferences and computers and information in engineering conference (DETC2007/CIE-34681). Las Vegas, NV, USA
Shaikh I, Jayaram U, Jayaram S, Palmer C (2004) Participatory ergonomics using VR integrated with analysis tools. In: 2004 winter simulation conference. Washington, DC
Singh B, Bettig B (2004) Port-compatibility and connectability based assembly design. J Comput Inf Sci Eng 4(3):197–205
Smith SS-F, Smith G, Liao X (2001) Automatic stable assembly sequence generation and evaluation. J Manuf Syst 20(4):225–235
Sunde G (1988) Specification of shape by dimensions and other geometric constraints. In: Geometric modeling for CAD applications. North Holland IFIP
Sung RCW, Corney JR, Clark DER (2001) Automatic assembly feature recognition and disassembly sequence generation. J Comput Inf Sci Eng 1(4):291–299
Suzuki H, Ando H, Kimura F (1990) Variation of geometries based on a geometric-reasoning method. Comput Graph 14(2):211–224
Taylor F, Jayaram S, Jayaram U (2000) Functionality to facilitate assembly of heavy machines in a virtual environment. In: ASME design engineering technical conferences (DETC2000/CIE-14590). Baltimore, MD
Verroust A, Schonek F, Roller D (1992) Rule-oriented method for parameterized computer-aided design. Comput Aided Design 24(3):531–540
Volkov S, Vance JM (2001) Effectiveness of haptic sensation for the evaluation of virtual prototypes. ASME J Comput Inf Sci Eng 1(2):123–128
Wan H, Gao S, Peng Q, Dai G, Zhang F (2004a) MIVAS: a multi-modal immersive virtual assembly system. In: ASME design engineering technical conferences and computers and information in engineering conference (DETC2004/CIE-57660). Salt Lake City, UT
Wan H, Luo Y, Gao S, Peng Q (2004b) Realistic virtual hand modeling with applications for virtual grasping. In: 2004 ACM SIGGRAPH international conference on virtual reality continuum and its applications in industry, pp 81–87
Wang Y, Jayaram S, Jayaram U, Lyons K (2001) Physically based modeling in virtual assembly. In: ASME design engineering technical conferences and computers and information in engineering conference (DETC2001/CIE-21259). Pittsburgh, PA
Wang Y, Jayaram U, Jayaram S, Shaikh I (2003) Methods and algorithms for constraint based virtual assembly. Virtual Real 6:229–243
Wang QH, Li JR, Gong HQ (2006) A CAD-linked virtual assembly environment. Int J Prod Res 44(3):467–486
Witkin A, Gleicher M, Welch W (1990) Interactive dynamics. Computer Graphics 24(2):11–22
Xianglong Y, Yuncheng F, Tao L, Fei W (2001) Solving sequential decision-making problems under virtual reality simulation system. In: Winter simulation conference proceedings, Arlington, Virginia
Yang R, Wu D, Fax X, Yan J (2007) Research on constraint-based virtual assembly technologies. Front Mech Eng China 2(2):243–249
Ye N, Banerjee P, Banerjee A, Dech F (1999) A comparative study of virtual assembly planning in traditional and virtual environments. IEEE Trans Syst Man Cybern Part C Appl Rev 29(4):546–555
Zachmann G, Rettig A (2001) Natural and robust interaction in virtual assembly simulation. In: 8th ISPE international conference on concurrent engineering: research and applications. Anaheim, CA
Zha XF, Lim SYE, Fok SC (1998) Integrated intelligent design and assembly planning: a survey. Int J Adv Manuf Technol 14(9):664–685
Zhang Y, Sotudeh R, Fernando T (2005) The use of visual and auditory feedback for assembly task performance in a virtual environment. In: Proceedings of the 21st spring conference on computer graphics. Budmerice, Slovakia
Zhu Z, Gao S, Wan H, Luo Y, Yang W (2004) Grasp identification and multi-finger haptic feedback for virtual assembly. In: ASME design engineering technical conferences and computers and information in engineering conference (DETC2004/CIE-57718). ASME, Salt Lake City, Utah, USA
Zorriassatine F, Wykes C, Parkin R, Gindy N (2003) A survey of virtual prototyping techniques for mechanical product development. Inst Mech Eng Part B J Eng Manuf 217(4):513–530
