Topic Review
Root-Finding Algorithm
In mathematics and computing, a root-finding algorithm is an algorithm for finding zeroes, also called "roots", of continuous functions. A zero of a function f, from the real numbers to the real numbers or from the complex numbers to the complex numbers, is a number x such that f(x) = 0. Since the zeroes of a function generally cannot be computed exactly or expressed in closed form, root-finding algorithms provide approximations to zeroes, expressed either as floating-point numbers or as small isolating intervals (or disks for complex roots); an interval or disk output is equivalent to an approximate output together with an error bound. Solving an equation f(x) = g(x) is the same as finding the roots of the function h(x) = f(x) − g(x), so root-finding algorithms allow solving any equation defined by continuous functions. However, most root-finding algorithms do not guarantee that they will find all the roots; in particular, if such an algorithm does not find any root, that does not mean that no root exists.

Most numerical root-finding methods use iteration, producing a sequence of numbers that ideally converges towards the root as a limit. They require one or more initial guesses of the root as starting values; each iteration of the algorithm then produces a successively more accurate approximation to the root. Since the iteration must be stopped at some point, these methods produce an approximation to the root, not an exact solution. Many methods compute subsequent values by evaluating an auxiliary function on the preceding values. The limit is thus a fixed point of the auxiliary function, which is chosen for having the roots of the original equation as fixed points and for converging rapidly to these fixed points.

The behaviour of general root-finding algorithms is studied in numerical analysis. For polynomials, however, root-finding generally belongs to computer algebra, since the algebraic properties of polynomials are fundamental to the most efficient algorithms. The efficiency of an algorithm may depend dramatically on the characteristics of the given functions; for example, many algorithms use the derivative of the input function, while others work on every continuous function. For polynomials there are specific algorithms that use algebraic properties to certify that no root is missed and to locate the roots in separate intervals (or disks for complex roots) small enough to ensure the convergence of numerical methods (typically Newton's method) to the unique root so located.
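As a minimal illustration of the iterative approach described above, the sketch below applies Newton's method to a concrete function; the function, derivative, starting value, and tolerance are illustrative choices, not part of the entry.

```python
# Minimal sketch of Newton's method: iterate x_{n+1} = x_n - f(x_n)/f'(x_n)
# until two successive approximations agree to within a chosen tolerance.
def newton(f, df, x0, tol=1e-12, max_iter=100):
    x = x0
    for _ in range(max_iter):
        dfx = df(x)
        if dfx == 0:                      # derivative vanished; no progress possible
            raise ZeroDivisionError("zero derivative encountered")
        x_next = x - f(x) / dfx           # Newton update
        if abs(x_next - x) < tol:         # stop once the iterates have converged
            return x_next
        x = x_next
    raise RuntimeError("did not converge within max_iter iterations")

# Example: approximate the square root of 2 as the positive root of f(x) = x^2 - 2.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
print(root)  # ~1.4142135623730951
```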
  • 9.6K
  • 14 Oct 2022
Topic Review
Fifth Generation Computer
The Fifth Generation Computer Systems (FGCS) was an initiative by Japan's Ministry of International Trade and Industry (MITI), begun in 1982, to create computers using massively parallel computing and logic programming. It was to be the result of a government/industry research project in Japan during the 1980s. It aimed to create an "epoch-making computer" with supercomputer-like performance and to provide a platform for future developments in artificial intelligence. There was also an unrelated Russian project likewise called a fifth-generation computer (see Kronos (computer)). Ehud Shapiro, in his "Trip Report" paper (which focused the FGCS project on concurrent logic programming as the software foundation for the project), captured the rationale and motivations driving the project.

The term "fifth generation" was intended to convey the system as being advanced. In the history of computing hardware, computers using vacuum tubes were called the first generation; transistors and diodes, the second; integrated circuits, the third; and those using microprocessors, the fourth. Whereas previous computer generations had focused on increasing the number of logic elements in a single CPU, the fifth generation, it was widely believed at the time, would instead turn to massive numbers of CPUs for added performance. The project was to create the computer over a ten-year period, after which it was considered ended and investment in a new "sixth generation" project would begin. Opinions about its outcome are divided: either it was a failure, or it was ahead of its time.
  • 9.2K
  • 21 Oct 2022
Topic Review
Big Data Mining
Big data mining (BDM) is an approach that applies cumulative data mining and extraction techniques to large datasets or volumes of data. It focuses mainly on retrieving relevant, in-demand information (or patterns) and thus extracting the value hidden in data of immense volume. BDM draws on conventional data mining notions but also incorporates aspects of big data; that is, it makes it possible to acquire useful information from databases or data streams that are huge in terms of the “big data V’s”, such as volume, velocity, and variety.
  • 5.7K
  • 05 Aug 2021
Topic Review
Feature Detection (Computer Vision)
In computer vision and image processing, feature detection includes methods for computing abstractions of image information and making a local decision at every image point as to whether an image feature of a given type is present at that point. The resulting features are subsets of the image domain, often in the form of isolated points, continuous curves, or connected regions.
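As a toy example of such a "local decision at every image point", the hedged sketch below marks pixels whose gradient magnitude exceeds a threshold as edge features; it assumes NumPy, and the threshold value and synthetic image are illustrative only, not a standard detector from the entry.

```python
import numpy as np

def detect_edges(image, threshold=0.2):
    """Toy feature detector: mark a pixel as an 'edge' feature when the local
    gradient magnitude exceeds a threshold. `image` is a 2-D float array in [0, 1]."""
    # Finite-difference approximations of the image gradient.
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    # The local decision at every image point: feature present or not.
    return magnitude > threshold

# Example: a synthetic image with a bright square produces responses along its border.
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0
mask = detect_edges(img)
print(mask.sum(), "edge pixels found")
```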
  • 4.5K
  • 10 Oct 2022
Topic Review
Timeline of Operating Systems
This article presents a timeline of events in the history of computer operating systems from 1951 to the present day. For a narrative explaining the overall developments, see the History of operating systems.
  • 4.1K
  • 04 Nov 2022
Topic Review
List of Instruction Sets
A list of computer central processor instruction sets, ordered alphabetically by manufacturer.
  • 3.6K
  • 04 Nov 2022
Topic Review
Digital Literacy
Digital literacy refers to an individual's ability to find, evaluate, and compose clear information through writing and other media on various digital platforms. It is evaluated by an individual's grammar, composition, typing skills, and ability to produce text, images, audio, and designs using technology. While digital literacy initially focused on digital skills and stand-alone computers, the advent of the internet and the use of social media have shifted some of its focus to mobile devices. Like other expanding definitions of literacy that recognize cultural and historical ways of making meaning, digital literacy does not replace traditional forms of literacy; instead it builds upon and expands the skills that form their foundation. Digital literacy should be considered a part of the path to knowledge. It is built on the expanding role of social science research in the field of literacy as well as on concepts of visual literacy, computer literacy, and information literacy. Overall, digital literacy shares many defining principles with other fields that use modifiers in front of "literacy" to define ways of being and domain-specific knowledge or competence. The term has grown in popularity in education and higher-education settings and is used in both international and national standards.
  • 2.7K
  • 31 Oct 2022
Topic Review
Chaotic Image Encryption
Chaos is the characteristic of a system whose current state is guaranteed to be highly sensitive to the previous state (spatial chaos), the initial conditions (temporal chaos), or both (spatio-temporal chaos). Such sensitivity makes the output or the behavior of a chaotic system difficult to predict. Chaos theory justifies and formulates the apparent disorder of chaotic systems on the basis of orderly patterns, structured feedback loops, iterative repetitions, self-organization, self-similarity, fractals, etc. Chaotic maps, attractors, and sequences all refer to the mathematical structures used for this formulation. Chaotic systems, maps, attractors, and sequences have been of great interest to the research community, and they have been used for security purposes in a broad variety of applications ranging from smart grids to communication systems. In particular, chaotic encryption has been used for encrypting a variety of content types in addition to images.
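A minimal sketch of the general idea, assuming NumPy: a keystream is generated from the logistic map x_{n+1} = r·x_n·(1 − x_n) and XORed with the pixel bytes, so that small changes in the key (x0, r) yield a completely different ciphertext. The parameter values are illustrative, and this is not a production-grade scheme from the entry.

```python
import numpy as np

def logistic_keystream(length, x0=0.7, r=3.99):
    """Generate a byte keystream from the logistic map x_{n+1} = r*x_n*(1 - x_n).
    The key is (x0, r); its sensitivity to tiny changes is the chaotic property used."""
    x = x0
    stream = np.empty(length, dtype=np.uint8)
    for i in range(length):
        x = r * x * (1.0 - x)            # one chaotic iteration
        stream[i] = int(x * 256) % 256   # quantize the state to a byte
    return stream

def chaotic_xor(image_bytes, x0=0.7, r=3.99):
    """Encrypt (or decrypt) a flat array of pixel bytes by XOR with the chaotic keystream."""
    ks = logistic_keystream(image_bytes.size, x0, r)
    return image_bytes ^ ks

# Example round trip on a random 8x8 grayscale "image".
img = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)
cipher = chaotic_xor(img.ravel())
plain = chaotic_xor(cipher).reshape(img.shape)
assert np.array_equal(plain, img)
```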
  • 2.3K
  • 27 Jul 2022
Topic Review
Windows NT
Windows NT is a proprietary graphical operating system produced by Microsoft, the first version of which was released on July 27, 1993. It is a processor-independent, multiprocessing, multi-user operating system. The first version of Windows NT was Windows NT 3.1, produced for workstations and server computers. It was a commercially focused operating system intended to complement consumer versions of Windows that were based on MS-DOS (including Windows 1.0 through Windows 3.1x). Gradually, the Windows NT family was expanded into Microsoft's general-purpose operating system product line for all personal computers, deprecating the Windows 9x family. "NT" was formerly expanded to "New Technology" but no longer carries any specific meaning; starting with Windows 2000, "NT" was removed from the product name and appears only in the product version string and in several low-level places within the system.

NT was the first purely 32-bit version of Windows, whereas its consumer-oriented counterparts, Windows 3.1x and Windows 9x, were 16-bit/32-bit hybrids. It is a multi-architecture operating system: initially it supported several instruction set architectures, including IA-32, MIPS, and DEC Alpha; support for PowerPC, Itanium, x64, and ARM was added later. The latest versions support x86 (including IA-32 and x64) and ARM. Major features of the Windows NT family include Windows Shell, Windows API, Native API, Active Directory, Group Policy, Hardware Abstraction Layer, NTFS, BitLocker, Windows Store, Windows Update, and Hyper-V.
  • 2.1K
  • 02 Nov 2022
Topic Review
Thread Block
A thread block is a programming abstraction that represents a group of threads that can be executed serially or in parallel. For better process and data mapping, threads are grouped into thread blocks. The number of threads in a block varies with the available shared memory and is also limited by the architecture: early CUDA hardware allowed at most 512 threads per block, while current devices allow 1,024. The threads of a block run on the same streaming multiprocessor and can communicate with each other via shared memory, barrier synchronization, or other synchronization primitives such as atomic operations. Multiple blocks are combined to form a grid, and all blocks in the same grid contain the same number of threads. Since the number of threads in a block is limited, grids can be used for computations that require a large number of thread blocks to operate in parallel.

CUDA is a parallel computing platform and programming model that higher-level languages can use to exploit parallelism. In CUDA, a kernel (a small program or function) is executed with the aid of threads; the thread is an abstract entity that represents the execution of the kernel. Multithreaded applications use many such threads running at the same time to organize parallel computation. Every thread has an index, which is used for calculating memory address locations and also for taking control decisions.
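A minimal sketch of the block/grid model, written with Numba's Python bindings for CUDA purely as an illustration (the entry itself is language-agnostic); it assumes the `numba` package and a CUDA-capable GPU, and the sizes chosen are arbitrary.

```python
import numpy as np
from numba import cuda

@cuda.jit
def add_one(arr):
    # Each thread computes its global index from its block index,
    # the block size, and its index within the block.
    i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
    if i < arr.size:          # guard against threads past the end of the data
        arr[i] += 1.0

data = np.zeros(1000, dtype=np.float32)
threads_per_block = 256                       # threads in one block
blocks_per_grid = (data.size + threads_per_block - 1) // threads_per_block
d_data = cuda.to_device(data)
add_one[blocks_per_grid, threads_per_block](d_data)   # launch the grid of blocks
print(d_data.copy_to_host()[:5])              # [1. 1. 1. 1. 1.]
```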
  • 1.5K
  • 29 Nov 2022