What are the types of Parallelism in Computer Architecture?


There are various types of Parallelism in Computer Architecture which are as follows −

  • Available and Utilized Parallelism

Parallelism is one of the most important topics in computing. Architectures, compilers, and operating systems have been striving for more than two decades to extract and exploit as much parallelism as possible to speed up computation.

Functional parallelism is the type of parallelism that arises from the logic of a problem solution. It occurs, to a greater or lesser extent, in all formal descriptions of problem solutions, such as program flow diagrams, dataflow graphs, and programs.

Data parallelism is inherent only in a restricted set of problems, such as scientific or engineering calculations and image processing. This type of parallelism gives rise to massively parallel execution of the data-parallel part of the computation.
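As a minimal sketch of data parallelism, consider brightening an image: the same operation is applied independently to every pixel, so in principle all pixels could be processed simultaneously. The tiny 2-D list below stands in for real image data (a hypothetical example, not from the original text).

```python
# Hypothetical image: a 2-D list of pixel intensities (0-255).
image = [[10, 20], [30, 40]]

# The same operation is applied independently to every element, so a
# data-parallel machine could process all pixels at once. Here the
# comprehension expresses that elementwise structure sequentially.
brightened = [[min(p + 50, 255) for p in row] for row in image]
```

Each element's result depends only on that element, which is exactly the property that makes the computation data-parallel.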

  • Levels of the available functional parallelism

Programs written in imperative languages can express functional parallelism at various levels, that is, at multiple granularities. We can recognize the following four levels and corresponding grain sizes −

  • Instruction-level parallelism means that particular instructions of a program can be executed in parallel. Instructions can be assembly (machine-level) or high-level language instructions.

  • Parallelism can also be available at the loop level, where consecutive loop iterations are candidates for parallel execution. Data dependencies between successive loop iterations, known as recurrences, can restrict their parallel execution.

  • Parallelism is available at the procedure level in the form of procedures that can be executed in parallel.

  • Parallelism is also available at the user level, which is treated as coarse-grained parallelism.
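Loop-level parallelism can be sketched as follows: when iterations carry no recurrence, each one can run independently. This example (an illustration, not from the original text) uses Python's standard `concurrent.futures` to run independent iterations on a thread pool.

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    # The body of one loop iteration: depends only on its own input.
    return x * x

# Iterations are independent (no recurrence between them), so they are
# candidates for parallel execution across the pool's worker threads.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(square, range(8)))

# By contrast, a recurrence such as acc = acc + x, where each iteration
# reads the previous iteration's result, would serialize the loop.
```

`pool.map` preserves input order, so `results` matches what the sequential loop would produce.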

  • Utilization of Functional Parallelism

The available parallelism can be utilized by architectures, compilers, and operating systems to speed up computation. Let us first consider the utilization of functional parallelism. Functional parallelism can be utilized at four different levels of granularity: instruction, thread, process, and user level.

It is quite natural to utilize the functional parallelism available at the instruction level in a conventional sequential program by executing instructions in parallel. This can be achieved using architectures capable of parallel instruction execution. Such architectures are called instruction-level functional-parallel architectures, or simply instruction-level parallel architectures, frequently abbreviated as ILP architectures.
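What an ILP architecture exploits is the absence of data dependences between nearby instructions. The toy example below (illustrative only; real ILP is extracted by the hardware or compiler, not by the programmer) shows two independent operations that a superscalar processor could overlap, followed by a dependent one that must wait.

```python
# Two independent operations: neither reads the other's result, so an
# ILP-capable (e.g. superscalar) processor could execute them in parallel.
a = 2 * 3
b = 4 + 5

# A dependent operation: it reads both a and b, so it must wait until
# both earlier instructions have completed.
c = a + b
```

The dependence on `a` and `b` is exactly the kind of constraint that limits how many instructions can be issued together.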

  • Concurrent Execution Models

Thread-level concurrent execution is known as multithreading. In this approach, multiple threads can be created for each process, and these threads are executed concurrently on a single processor under the supervision of the operating system.

Multithreading is generally interpreted as concurrent execution at the thread level. Multithreading assumes that a process has multiple threads; that is, a process-thread model is used to describe and schedule units of work for the processor.
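A minimal multithreading sketch using Python's standard `threading` module (the worker function and data here are hypothetical): several threads of one process each perform a unit of work, with a lock protecting the shared result list.

```python
import threading

results = []
results_lock = threading.Lock()

def worker(n):
    # Each thread is one schedulable unit of work within the process.
    # The lock serializes access to the shared list.
    with results_lock:
        results.append(n * n)

# Create several threads inside a single process, run them
# concurrently, and wait for all of them to finish.
threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Thread completion order is not guaranteed, so the results may arrive in any order even though every unit of work is performed exactly once.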

Process-level concurrent execution is generally known as multitasking. Multitasking means the concurrent execution of processes. Multiple ready-to-run processes can be created either by a single user, if process spawning is possible, or by multiple users working in a multiprogramming or time-sharing environment.
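Process-level concurrency can be sketched with Python's standard `multiprocessing` module (the `cube` function is a made-up workload): each worker is a separate operating-system process that the OS scheduler multitasks.

```python
import multiprocessing as mp

def cube(x):
    # The unit of work given to each worker process.
    return x ** 3

if __name__ == "__main__":
    # Each pool worker is a separate OS process; the operating system
    # schedules these ready-to-run processes concurrently (multitasking).
    with mp.Pool(processes=2) as pool:
        out = pool.map(cube, [1, 2, 3])
    print(out)  # → [1, 8, 27]
```

Unlike threads, these workers do not share memory; inputs and results are passed between processes by the pool, which is why the `__main__` guard is required on platforms that spawn new interpreters.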

  • Utilization of Data Parallelism

Data parallelism can be utilized in two different ways. One is to exploit data parallelism directly through dedicated architectures that permit parallel or pipelined operations on data elements, known as data-parallel architectures.
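The pipelined style of operating on data elements can be sketched with generator stages (a conceptual illustration with made-up stage names, not an actual hardware pipeline): each stage applies one operation, and elements flow through the stages in sequence.

```python
def scale(stream, k):
    # First pipeline stage: multiply each data element by k.
    for x in stream:
        yield x * k

def offset(stream, b):
    # Second pipeline stage: add b to each element produced upstream.
    for x in stream:
        yield x + b

# Conceptually, while the offset stage handles one element, the scale
# stage can already be working on the next one.
data = range(5)
pipeline = offset(scale(data, 2), 1)
result = list(pipeline)
```

In a real data-parallel or pipelined architecture the stages run simultaneously in hardware; the generators here only model the per-element flow of data through successive operations.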

raja
Published on 20-Jul-2021 07:01:30