What are the conditions of Parallelism in Computer Architecture?

There are various conditions of Parallelism which are as follows −

  • Data and resource dependencies − A program is made up of several segments, so executing those segments in parallel requires that each segment be independent of the others. Dependencies between the segments of a program can take several forms, such as resource dependency, control dependency, and data dependency.

    A dependence graph can describe these relations. Nodes represent the program statements, and directed edges with labels show the ordering relations among the statements. Analyzing the dependence graph reveals where opportunities for parallelization and vectorization exist.

    Data Dependencies − Data dependencies express the ordering relations between statements. There are five types of data dependencies, as follows −

    • Antidependency − A statement ST2 is antidependent on statement ST1 if ST2 follows ST1 in program order and the output of ST2 overlaps the input of ST1.
    • I/O dependence − Read and write are I/O statements. I/O dependence occurs not because the same variable is involved but because the same file is referenced by both I/O statements.
    • Unknown dependence − The dependence relation between two statements cannot be determined in the following cases −
      • The subscript of a variable is itself subscripted.
      • The subscript does not contain the loop index variable.
      • The subscript is nonlinear in the loop index variable.
    • Output dependence − Two statements are output-dependent if they write to the same output variable.
    • Flow dependence − A statement ST2 is flow-dependent on a statement ST1 if an execution path exists from ST1 to ST2 and at least one output of ST1 feeds in as input to ST2.
  • Software Parallelism − Software parallelism is defined by the control and data dependencies of programs. The degree of parallelism is revealed in the program profile or program flow graph. Software parallelism is a function of the algorithm, programming style, and compiler optimization. Program flow graphs show the patterns of simultaneously executable operations. Parallelism in a program varies during execution.

  • Hardware Parallelism − Hardware parallelism is defined by machine hardware and hardware multiplicity. It is a function of cost and performance trade-offs. It displays the resource utilization pattern of simultaneously executable operations. It can also indicate the peak performance of the processor resources. One way of characterizing the parallelism in hardware is by the number of instructions issued per machine cycle.
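The dependence graph described above can be sketched in code. In this minimal Python sketch, the statement labels and dependence edges are invented for illustration; the check only considers direct dependence, not transitive chains:

```python
# Sketch of a dependence graph as an adjacency list; the statement
# labels and edges below are hypothetical examples.
# An edge S1 -> S2 means statement S2 depends on statement S1.
deps = {
    "S1": [],           # S1: a = b + c   (no dependencies)
    "S2": ["S1"],       # S2: d = a * 2   (flow-dependent on S1)
    "S3": [],           # S3: e = f - g   (independent of S1 and S2)
    "S4": ["S2", "S3"], # S4: h = d + e   (flow-dependent on S2 and S3)
}

def can_run_in_parallel(x, y, graph):
    """Two statements may execute in parallel if neither directly
    depends on the other."""
    return x not in graph[y] and y not in graph[x]

print(can_run_in_parallel("S1", "S3", deps))  # True: both are independent
print(can_run_in_parallel("S1", "S2", deps))  # False: S2 must wait for S1
```

Analyzing such a graph is exactly how a compiler exposes opportunities for parallelization: statements with no path between them can be scheduled in the same step.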
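To make the dependence definitions concrete, here is a small hypothetical Python fragment annotated with the dependence types it exhibits (the variable names, values, and file name are all illustrative):

```python
# Hypothetical statements annotated with the data-dependence types above.
a = 5           # ST1
b = a + 1       # ST2: flow-dependent on ST1 (reads a, which ST1 writes)
a = 10          # ST3: antidependent on ST2 (overwrites a, which ST2 reads)
                #      and output-dependent on ST1 (both write a)
c = a * 2       # ST4: flow-dependent on ST3

# I/O dependence comes from the file, not from any variable: both blocks
# reference the same file (the name is illustrative), so they cannot be
# reordered without changing the file's contents.
with open("data.txt", "w") as f:
    f.write("first\n")
with open("data.txt", "a") as f:
    f.write("second\n")     # must run after the first write

print(b, c)  # 6 20
```

Swapping ST2 and ST3 would change the value of `b`, which is why the antidependence forbids reordering them.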
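The mismatch between software and hardware parallelism can be estimated with simple arithmetic. In this sketch, the issue width and operation count are made-up numbers used only to show the calculation:

```python
import math

def cycles_needed(num_independent_ops, issue_width):
    """Minimum machine cycles to execute num_independent_ops independent
    operations on a processor issuing issue_width instructions per cycle."""
    return math.ceil(num_independent_ops / issue_width)

# Hypothetical numbers: a program segment exposes 8 independent operations
# (a software parallelism of 8), executed on 2-issue and 4-issue machines.
print(cycles_needed(8, 2))  # 4 cycles on a 2-issue machine
print(cycles_needed(8, 4))  # 2 cycles on a 4-issue machine
```

The parallelism the program exposes is fixed by the algorithm and compiler, while the cycles actually taken depend on how many instructions the hardware can issue at once, which is the cost/performance trade-off mentioned above.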

Updated on: 30-Jul-2021
