Topic Review
Softmod
A softmod is a method of using software to modify the intended behavior of hardware, such as video cards, sound cards, or game consoles, in a way that overcomes restrictions of the firmware or allows custom firmware to be installed.
  • 6.0K
  • 23 Nov 2022
Topic Review
Mitsubishi Challenger
The Mitsubishi Challenger is a mid-size SUV produced by the Japanese manufacturer Mitsubishi Motors since 1996, spanning three generations. Since 2015, with the third-generation model, Mitsubishi has no longer used the Challenger name, marketing the vehicle as the Pajero Sport/Montero Sport/Shogun Sport instead.
  • 5.9K
  • 10 Oct 2022
Topic Review
Risk-Acceptance Criteria
Risk acceptance criteria (RAC) help a business judge whether the risk level of any process in its working environment is acceptable, especially when the risk has a significant societal impact.
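The acceptability judgment described above is often implemented as a scored probability-consequence check. The sketch below illustrates the idea; the 1-5 scales and the acceptance threshold are purely illustrative assumptions, not values from any standard or from the entry itself.

```python
# Minimal sketch of a risk-acceptance check, assuming a simple
# probability x consequence scoring scheme. The 1-5 scales and the
# threshold of 8 are illustrative, not from any standard.

def risk_level(probability: int, consequence: int) -> int:
    """Score risk as probability (1-5) times consequence (1-5)."""
    return probability * consequence

def is_acceptable(probability: int, consequence: int, threshold: int = 8) -> bool:
    """A risk is accepted only if its score stays below the threshold."""
    return risk_level(probability, consequence) < threshold

# A likely but minor hazard passes; a rare but severe one does not.
print(is_acceptable(4, 1))  # low consequence: acceptable
print(is_acceptable(3, 5))  # high consequence: not acceptable
```

In practice RAC are usually tabulated as a risk matrix rather than a single threshold, but the decision logic reduces to a comparison like the one above.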
  • 5.8K
  • 28 Mar 2022
Topic Review
SSC Aero
The SSC Ultimate Aero is a mid-engined sports car that was produced by SSC North America (formerly known as Shelby SuperCars) from 2004 until 2013. The SSC Ultimate Aero held the world production car speed record title, according to Guinness World Records, from 2007 (when it was officially timed at 410 km/h) until the introduction of the Bugatti Veyron Super Sport in 2010. In April 2013, Guinness World Records temporarily disqualified the Veyron's record time for a period of five days over concerns that deactivating its electronic speed limiter had altered the function of the car, but after investigation reinstated the Veyron as the record holder. The SSC Ultimate Aero was not sold with electronic driver aids such as an anti-lock braking system or traction control system; according to Jerod Shelby (no relation to Carroll Shelby), "Early design philosophy on the car was to make it a driver's car. I wanted a car that you not only throttled with your right foot but at times you could steer with your right foot and a sensor."
  • 5.8K
  • 04 Nov 2022
Topic Review
Digital Twin: Origin to Future
Digital Twin (DT) refers to the virtual copy or model of any physical entity (the physical twin), the two being interconnected via real-time data exchange. Conceptually, a DT mimics the state of its physical twin in real time and vice versa. Applications of DTs include real-time monitoring, designing/planning, optimization, maintenance, remote access, etc. Their implementation is expected to grow exponentially in the coming decades. The advent of Industry 4.0 has brought complex industrial systems that are more autonomous, smart, and highly interconnected. These systems generate considerable amounts of data useful for several applications such as improving performance, predictive maintenance, training, etc. A sudden influx of publications related to ‘Digital Twin’ has led to confusion between the different terminologies related to the digitalization of industries. Another problem arising from the growing popularity of the DT is a lack of consensus on its description, as well as the many different types of DT, which adds to the confusion. This paper consolidates the different types and definitions of DT found throughout the literature for easy identification of the DT among complementary terms such as ‘product avatar’, ‘digital thread’, ‘digital model’, and ‘digital shadow’. The paper follows the concept of DT from its inception to its predicted future to realize the value it can bring to certain sectors. Understanding the characteristics and types of DT while weighing its pros and cons is essential for any researcher, business, or sector before investing in the technology.
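The core mechanism the entry describes, a virtual object kept in sync with a physical asset through data exchange, and used for monitoring, can be sketched in a few lines. The `PumpTwin` class and its telemetry fields below are illustrative assumptions, not part of any standard DT framework.

```python
# Minimal sketch of the digital-twin idea: a virtual copy that mirrors
# the state of a physical asset via telemetry updates and runs a
# monitoring rule on that mirrored state. All names are illustrative.

from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    """Virtual copy tracking the latest state of a physical pump."""
    rpm: float = 0.0
    temperature_c: float = 0.0
    history: list = field(default_factory=list)

    def sync(self, rpm: float, temperature_c: float) -> None:
        """Ingest a real-time telemetry update from the physical twin."""
        self.rpm, self.temperature_c = rpm, temperature_c
        self.history.append((rpm, temperature_c))

    def needs_maintenance(self, temp_limit: float = 80.0) -> bool:
        """Example predictive-maintenance rule run on the virtual copy."""
        return self.temperature_c > temp_limit

twin = PumpTwin()
twin.sync(rpm=1450.0, temperature_c=85.5)  # update from the physical twin
print(twin.needs_maintenance())            # over the temperature limit
```

A distinction the paper draws matters here: with automated data flow in only one direction this object would be a ‘digital shadow’; a full DT also feeds decisions back to the physical asset.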
  • 5.8K
  • 25 Jun 2021
Topic Review
New Literacies
New literacies generally are new forms of literacy made possible by digital technology developments, although new literacies do not necessarily have to involve use of digital technologies to be recognized as such. The term "new literacies" itself is relatively new within the field of literacy studies (the first documented mention of it in an academic article title dates to 1993 in a text by David Buckingham). Its definition remains open, with new literacies being conceptualized in different ways by different groups of scholars. For example, one group of scholars argues that literacy is now deictic, and sees it as continually and rapidly changing as new technologies appear and new social practices for literacy emerge (Leu, 2000). This group aims to develop a single, overarching theory to help explain new literacies (see, for example, Leu, O'Byrne, Zawilinski, McVerry, & Everett-Cacopardo, 2009; see also, below). This orientation towards new literacies is largely psycholinguistic in nature. Other groups of scholars follow a more sociocultural orientation that focuses on literacy as a social practice, which emphasizes the role of literacy within a range of socially patterned and goal-directed ways of getting things done in the world (see, for example, Gee & Hayes, 2012; Lankshear & Knobel, 2011; Kalantzis and Cope 2011). Accompanying the varying conceptualizations of new literacies, there are a range of terms used by different researchers when referring to new literacies, including 21st century literacies, internet literacies, digital literacies, new media literacies, multiliteracies, information literacy, ICT literacies, and computer literacy. In the Handbook of New Literacies Research, Coiro, Knobel, Lankshear, and Leu (2008) note that all these terms "are used to refer to phenomena we would see as falling broadly under a new literacies umbrella" (p. 10).
Commonly recognized examples of new literacies include such practices as instant messaging, blogging, maintaining a website, participating in online social networking spaces, creating and sharing music videos, podcasting and videocasting, photoshopping images and photo sharing, emailing, shopping online, digital storytelling, participating in online discussion lists, using online chat, conducting and collating online searches, reading, writing and commenting on fan fiction, collaborating on and writing encyclopedic wikis, processing and evaluating online information, creating and sharing digital mashups, etc. (see: Black, 2008; Coiro, 2003; Gee, 2007; Hunter, 2014; Jenkins, 2006; Kist, 2007; Lankshear & Knobel, 2006; Lessig, 2005; Leu, et al. 2004; Prensky, 2006).
  • 5.8K
  • 14 Oct 2022
Topic Review
Slope Stability Analysis
Slope stability analysis is performed to assess the safe design of human-made or natural slopes (e.g. embankments, road cuts, open-pit mines, excavations, landfills, etc.) and their equilibrium conditions. Slope stability is the resistance of an inclined surface to failure by sliding or collapsing. The main objectives of slope stability analysis are finding endangered areas, investigating potential failure mechanisms, determining the sensitivity of the slope to different triggering mechanisms, designing optimal slopes with regard to safety, reliability and economics, and designing possible remedial measures, e.g. barriers and stabilization. Successful design of a slope requires geological information and site characteristics, e.g. properties of the soil/rock mass, slope geometry, groundwater conditions, alteration of materials by faulting, joint or discontinuity systems, movements and tension in joints, earthquake activity, etc. The presence of water has a detrimental effect on slope stability: water pressure acting in the pore spaces, fractures or other discontinuities of the materials that make up the pit slope reduces the strength of those materials. The choice of the correct analysis technique depends on both the site conditions and the potential mode of failure, with careful consideration given to the varying strengths, weaknesses and limitations inherent in each methodology. Before the computer age, stability analysis was performed graphically or with a hand-held calculator. Today engineers can choose from a wide range of analysis software, from simple limit equilibrium techniques through computational limit analysis approaches (e.g. finite element limit analysis, discontinuity layout optimization) to complex and sophisticated numerical solutions (finite-/distinct-element codes). The engineer must fully understand the limitations of each technique.
For example, limit equilibrium is the most commonly used and simplest solution method, but it can become inadequate if the slope fails by complex mechanisms (e.g. internal deformation and brittle fracture, progressive creep, liquefaction of weaker soil layers, etc.). In these cases more sophisticated numerical modelling techniques should be utilised. Also, even for very simple slopes, the results obtained with the typical limit equilibrium methods currently in use (Bishop, Spencer, etc.) may differ considerably. In addition, the use of the risk assessment concept is increasing today. Risk assessment is concerned with both the consequence of slope failure and the probability of failure (both require an understanding of the failure mechanism). The Slope Stability Radar, developed in 2003, can remotely scan a rock slope to monitor the spatial deformation of its face; using interferometry techniques, small movements of a rough wall can be detected with sub-millimetre accuracy.
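The simplest limit equilibrium method mentioned above can be made concrete with the classical infinite-slope model, which computes a factor of safety for a planar slip surface parallel to the ground. The parameter values below are illustrative, not from the entry; the model assumes a translational failure mode and is far too simple for the complex mechanisms noted in the text.

```python
# Infinite-slope limit equilibrium: factor of safety = resisting shear
# strength / driving shear stress on a slip plane at depth z.
# Parameter values in the example call are illustrative only.

import math

def infinite_slope_fs(c: float, phi_deg: float, gamma: float,
                      z: float, beta_deg: float, u: float = 0.0) -> float:
    """Factor of safety for an infinite slope.

    c: effective cohesion (kPa); phi_deg: effective friction angle (deg);
    gamma: unit weight (kN/m^3); z: slip-surface depth (m);
    beta_deg: slope angle (deg); u: pore pressure on the slip plane (kPa).
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    # Mohr-Coulomb strength on the slip plane (effective normal stress)
    resisting = c + (gamma * z * math.cos(beta) ** 2 - u) * math.tan(phi)
    # Shear stress driving the slide
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Pore pressure (u > 0) lowers the factor of safety, illustrating the
# detrimental effect of water described above.
print(round(infinite_slope_fs(c=5, phi_deg=30, gamma=19, z=4, beta_deg=25), 2))
print(round(infinite_slope_fs(c=5, phi_deg=30, gamma=19, z=4, beta_deg=25, u=20), 2))
```

A factor of safety below about 1 indicates failure; methods such as Bishop's or Spencer's generalize this balance to circular or arbitrary slip surfaces divided into slices.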
  • 5.8K
  • 19 Oct 2022
Topic Review
Intel 80386
The Intel 80386, also known as i386 or just 386, is a 32-bit microprocessor introduced in 1985. The first versions had 275,000 transistors and were the CPU of many workstations and high-end personal computers of the time. As the original implementation of the 32-bit extension of the 80286 architecture, the 80386 instruction set, programming model, and binary encodings are still the common denominator for all 32-bit x86 processors, which is termed the i386-architecture, x86, or IA-32, depending on context. The 32-bit 80386 can correctly execute most code intended for the earlier 16-bit processors such as 8086 and 80286 that were ubiquitous in early PCs. (Following the same tradition, modern 64-bit x86 processors are able to run most programs written for older x86 CPUs, all the way back to the original 16-bit 8086 of 1978.) Over the years, successively newer implementations of the same architecture have become several hundreds of times faster than the original 80386 (and thousands of times faster than the 8086). A 33 MHz 80386 was reportedly measured to operate at about 11.4 MIPS. The 80386 was introduced in October 1985, while manufacturing of the chips in significant quantities commenced in June 1986. Mainboards for 80386-based computer systems were cumbersome and expensive at first, but manufacturing was rationalized upon the 80386's mainstream adoption. The first personal computer to make use of the 80386 was designed and manufactured by Compaq and marked the first time a fundamental component in the IBM PC compatible de facto standard was updated by a company other than IBM. In May 2006, Intel announced that 80386 production would stop at the end of September 2007. Although it had long been obsolete as a personal computer CPU, Intel and others had continued making the chip for embedded systems. Such systems using an 80386 or one of many derivatives are common in aerospace technology and electronic musical instruments, among others. 
Some mobile phones also used (later fully static CMOS variants of) the 80386 processor, such as the BlackBerry 950 and the Nokia 9000 Communicator. Linux continued to support 80386 processors until December 11, 2012, when 386-specific support was removed from the kernel in version 3.8.
  • 5.8K
  • 23 Oct 2022
Topic Review
US Airways Livery
US Airways' aircraft livery varied under both the US Airways and the USAir name. In general, the Express and Shuttle divisions have had liveries that closely parallel the company-wide livery of the time. The US Airways livery has been replaced by the new American Airlines livery following the two airlines' merger.
  • 5.8K
  • 16 Nov 2022
Topic Review
Fluid Catalytic Cracking Catalyst Regeneration Intensification Technologies
Fluid catalytic cracking (FCC) is the workhorse of the modern crude oil refinery. Its regenerator plays a critical role in optimizing overall profitability by efficiently restoring catalyst activity and enhancing the heat balance in the riser reactor. Improvements in device metallurgy and process operations have enabled industrial regenerators to operate at high temperatures with a better coke-burning rate and a longer operating cycle. Today, the carbon content of regenerated catalyst has been reduced drastically, to less than 0.1 wt.%.
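The regenerator's coke-burning duty can be framed as a simple mass balance: the coke removed per unit time follows from the catalyst circulation rate and the carbon-on-catalyst levels before and after regeneration. The sketch below is a back-of-the-envelope illustration under stated assumptions (coke treated as pure carbon, complete combustion to CO2, dry air at 21 mol% O2); the flow rates and carbon levels are invented example numbers, not plant data from the entry.

```python
# Back-of-the-envelope regenerator duty: coke burn rate from a
# carbon-on-catalyst balance, and the stoichiometric air to burn it,
# assuming coke is pure carbon fully combusted to CO2 (C + O2 -> CO2).

M_C = 12.011        # molar mass of carbon, g/mol
O2_IN_AIR = 0.21    # mole fraction of O2 in dry air

def air_per_kg_coke() -> float:
    """kmol of air needed to burn 1 kg of coke (pure C) to CO2."""
    kmol_c = 1.0 / M_C     # kmol of carbon in 1 kg of coke
    kmol_o2 = kmol_c       # 1:1 stoichiometry for C + O2 -> CO2
    return kmol_o2 / O2_IN_AIR

def coke_burned(cat_rate_kg_s: float, c_spent: float, c_regen: float) -> float:
    """Coke burn rate (kg/s) from carbon fractions before/after regen."""
    return cat_rate_kg_s * (c_spent - c_regen)

# Illustrative numbers: 1000 kg/s catalyst circulation, 1.0 wt.% carbon
# on spent catalyst, 0.1 wt.% on regenerated catalyst (as in the text).
burn = coke_burned(1000.0, 0.010, 0.001)
print(round(burn, 1), "kg coke/s,", round(burn * air_per_kg_coke(), 2), "kmol air/s")
```

Real regenerators also burn the hydrogen in coke and operate in partial- or full-combustion modes (CO vs. CO2), so actual air demand differs from this stoichiometric minimum.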
  • 5.7K
  • 15 Mar 2022