1. Wear Mechanism of Bucket Teeth
Research on the wear of construction machinery's wear parts involves numerous complex, interacting factors, which further complicate the problem. To address wear in these parts, it is therefore essential to first understand the wear mechanisms and the different forms of wear exhibited by various components. Jiang et al. conducted extensive experimental studies on the progression and manifestations of wear on loader pins. Their research revealed that the wear process of these pins falls primarily into three stages: running-in (grinding), abrasive wear, and adhesive (bonded) wear, with each stage exhibiting a different degree of wear and distinct characteristics. Bucket teeth, a vital component of construction machinery, are typically positioned at the forefront of excavator and loader buckets. Because they come into direct contact with ores, gravel, and other materials, they are highly susceptible to severe and intricate wear. The wear of bucket teeth poses a complex challenge owing to the diverse working environments and varying material contacts, so each part of a bucket tooth exhibits a distinct form of wear. Based on this analysis, the wear mechanism of each part of the bucket teeth has been established, as illustrated in Figure 1.
Figure 1. Tooth wear diagram of bucket.
Bucket teeth are a crucial component of the cantilever beam member in excavators. They comprise the shovel head, shovel seat, and ring clips, and their degree of wear and fracture directly affects the quality and efficiency of extraction. Bucket teeth come in different forms (Figure 2), including rock teeth, earth-and-rock square teeth, conical teeth, bucket teeth, and others. The most common type is the conical tooth, located at the front of the machine in direct contact with materials, which causes significant wear of varying forms. During excavation, the tips of the bucket teeth bear an impact load when inserted into the material, resulting in impact wear. As the teeth penetrate deeper, the weight of material above them increases, causing relative sliding and two-body abrasive wear. Meanwhile, material rolls along the tooth surfaces into the bucket; because the gravity of this fine material is negligible and the pressure on the teeth is minimal, three-body abrasive wear occurs. Unloading of the excavator or loader bucket also produces three-body abrasive wear. Fretting wear has complex sources, including environmental vibrations and alternating stresses.
Figure 2. Different forms of bucket teeth.
2. Impact Wear
Impact wear is a distinctive type of wear often observed in engineering, resulting from a combination of processes such as impact and sliding friction. It occurs on part surfaces that repeatedly crush the abrasive material. In impact wear, concentrated compressive stress at the abrasive contact causes plastic flow and fatigue of the ductile phase on the metal surface, while the hard phase fractures. Because the stress on the material exceeds the crushing strength of the abrasive, this is classified as high-stress wear. When inserted into and excavating a mine face, bucket teeth are subjected to strong impact loads, resulting in gouging (chiseling) abrasive wear. Abrasive rock grains move rapidly over the metal surface of the teeth, their sharp edges cutting the tooth surface like knives; this action causes plastic deformation and forms ploughed grooves. The magnitude of the cutting resistance is determined by factors such as the nature and state of the ore rock, the geometry of the cutting part, the cutting angle, and the cutting thickness. Figure 3 illustrates the working environment of construction machinery excavating materials under impact-wear-dominated conditions.
Figure 3. Working conditions dominated by impact wear.
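The cutting-resistance factors listed above (nature of the rock, geometry of the cutting part, cutting angle, cutting thickness) are often folded into a first-order estimate in earthmoving mechanics: cutting force equals a specific cutting resistance times the cut cross-section. The sketch below is illustrative only; the formula and the numeric values are assumptions, not taken from the text.

```python
# Illustrative first-order cutting-resistance estimate for a bucket tooth.
# Assumption (not from the text): F = k * b * h, where k is the specific
# cutting resistance of the material [Pa], b the cut width [m], and h the
# cut depth [m]; k lumps together ore properties, tooth geometry, and angle.

def cutting_force(k_pa: float, width_m: float, depth_m: float) -> float:
    """Cutting force in newtons: specific resistance [Pa] x cut area [m^2]."""
    return k_pa * width_m * depth_m

# Example (assumed values): medium soil with k ~ 180 kPa,
# a 0.12 m wide tooth tip, and a 0.05 m cut depth.
force = cutting_force(180e3, 0.12, 0.05)
print(f"{force:.0f} N")
```

Harder or more compacted rock enters this sketch only through a larger assumed k, which is consistent with the text's point that resistance depends on the nature and state of the ore rock.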
Rice et al. investigated the pure impact wear behavior of different impact contact pairs at a dry interface under various impact contact forces and impact frequencies, as well as the compound impact-sliding wear behavior at different tangential velocities. Engel et al. investigated composite impact wear accompanied by sliding and found that a zero-wear period exists during the impact wear process, corresponding to the time required for fatigue cracks to nucleate, propagate, and fracture. Yang Yi investigated the impact wear behavior of Fe-Mn-Al-C lightweight high-manganese steel under different test conditions, setting different rotational speeds on the specimens to compare and analyze the impact wear mechanisms. Zhang et al. studied the wear behavior and wear mechanism of Mn13Cr2 high-manganese steel by using an impact abrasive wear testing machine and other instrumentation to simulate the actual working environment; a large number of slip bands appeared in the high-manganese steel after impact, and the density of slip bands increased with increasing impact work.
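Impact abrasive wear testers of the kind used in the Mn13Cr2 study above typically quantify test severity by the impact work per blow. A minimal sketch, assuming a drop-hammer style rig where the impact work is simply the potential energy released per impact (the rig type and numbers are assumptions for illustration):

```python
# Impact work per blow for an assumed drop-hammer impact wear tester:
# W = m * g * h (potential energy of the falling hammer, in joules).

G = 9.81  # gravitational acceleration, m/s^2

def impact_work(mass_kg: float, drop_height_m: float) -> float:
    """Impact work per blow in joules for a free-falling hammer."""
    return mass_kg * G * drop_height_m

# Example (assumed): a 10 kg hammer dropped from 0.05 m gives ~4.9 J per blow.
print(f"{impact_work(10.0, 0.05):.2f} J")
```

Raising the drop height or hammer mass raises the impact work, which per Zhang et al.'s observation would be expected to increase the slip-band density in the steel.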
3. Abrasive Wear
Abrasive wear is the most prevalent type of wear in construction machinery and accounts for a significant portion of overall wear. Yan et al. focused on the pin set of the ZL50 series loader and conducted extensive sample observations. By analyzing the loader pin's failure history, they determined that the primary form of wear on the pin is abrasive wear. Throughout the entire excavation process of construction machinery, abrasive wear persists for the longest duration.
There are several approaches to classifying abrasive wear. Avery classified it into gouging wear, high-stress wear, and low-stress (erosion) wear, based on the stresses the wear parts undergo. Burwell classified it into two-body and three-body abrasive wear, depending on the involvement of abrasive grains during wear: two-body abrasive wear occurs when a hard surface scratches a softer surface during frictional motion, while three-body abrasive wear occurs when an abrasive grain caught between two surfaces wears one or both of them. Misra and Finnie further refined the classification of abrasive wear in 1980. Gates concluded that abrasive wear should be classified according to both the stress level on the worn part and the form of movement of the abrasive. The evolution of these classifications reflects the complexity of abrasive wear.
Typically, the bucket scoops from the bottom upwards, and the bucket teeth slowly penetrate the material, resulting in relative sliding between the two. The weight of the material exerts pressure on the bucket teeth, causing sliding friction wear, which is a type of two-body abrasive wear. When discharging, the bucket is tilted downward, and the material rolls out of the bucket (Figure 4). At this point, there is minimal contact between the material and the bucket teeth, resulting in rolling friction wear, which is a type of three-body abrasive wear.
Figure 4. Working conditions dominated by abrasive wear.
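The text classifies abrasive wear but gives no quantitative model. A common first-order estimate, which could be used to compare the sliding-friction stage described above across materials, is the Archard relation V = K·W·s/H: worn volume grows with normal load W and sliding distance s and falls with the hardness H of the worn surface. The wear coefficient K and the example values below are assumptions chosen for illustration, not data from the studies cited.

```python
# Archard-type estimate of abrasive wear volume: V = K * W * s / H.
# K is a dimensionless wear coefficient (assumed), W the normal load [N],
# s the sliding distance [m], H the hardness of the softer surface [Pa].

def archard_wear_volume(load_n: float, sliding_m: float,
                        hardness_pa: float, k: float) -> float:
    """Worn volume in m^3 per the Archard relation V = K*W*s/H."""
    return k * load_n * sliding_m / hardness_pa

# Example (assumed): 5 kN load from overlying material, 100 m of sliding,
# tooth steel hardness ~4 GPa, K = 1e-3 (an order typical of severe
# two-body abrasion in the literature).
v = archard_wear_volume(5e3, 100.0, 4e9, 1e-3)
print(f"{v * 1e9:.1f} mm^3")  # worn volume in cubic millimetres
```

The same relation suggests why harder tooth alloys wear less under identical loading: doubling H halves the predicted worn volume, other things being equal.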
4. Fretting Wear
When two contacting surfaces are pressed together and nominally stationary, slight periodic vibrations or alternating stresses in the environment can cause small reciprocating sliding between them, leading to wear of fixed joints or motion pairs even during non-operating periods. Because it arises from environmental vibration and alternating stress, people are often unaware of this type of wear, and it is therefore frequently overlooked. This type of wear is commonly referred to as fretting wear.
The phenomenon of fretting was first discovered by Eden in 1911, but it did not attract attention until 1927, when Tomlinson designed equipment to study the fretting process and coined the term "fretting corrosion". With further research, fretting fatigue was discovered, and it was noted that fretting could accelerate fatigue damage. In their study of fretting wear mechanisms, Godfrey et al. found that mechanical action is the primary factor causing wear on material surfaces, while oxidation is a secondary factor: when fixed bonding surfaces experience oxidation and adhesion, abrasive debris (a third body) forms between the contact surfaces. From a different perspective, Godet et al. put forth the third-body theory of fretting based on earlier research. According to this theory, adhesion, plastic deformation, surface hardening, particle exfoliation, and the formation of abrasive debris on the contact surface occur under continuous oxidation reactions. Zhang et al. studied the effect of tangential force on fretting fatigue and discovered that wear depth increases gradually with increasing tangential force; moreover, higher tangential force reduces fretting fatigue life and affects the propagation of fatigue cracks. Figure 5 displays several fundamental forms of fretting motion on the bucket teeth of construction machinery.
Figure 5. Basic forms of micro-movement of the material on the bucket teeth.
Fretting wear causes plastic deformation and cracking of asperities (micro-convex bodies) on the friction pair surfaces under contact pressure. Additionally, the oxide or lubricating film on the contact surface is destroyed, leading to adhesion and junction formation between the surfaces. During fretting wear, chemical activity plays a significant role in the formation of oxide debris at sheared-off bonding points and on exposed nascent surfaces, which are gradually oxidized by interaction with oxygen. The resulting oxide debris can cause abrasive wear and contact-zone fatigue, especially because of the small amplitude, low relative sliding velocity, and close fit of the surfaces. Such oxide debris is not easily expelled from the contact area and therefore acts as an abrasive during abrasive wear.
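One widely used way to rank fretting severity, in the spirit of the tangential-force study cited above, is the dissipated-energy approach: in the gross-slip regime each cycle dissipates roughly E = 4·Q·d (an idealized rectangular friction hysteresis loop with tangential force Q and slip amplitude d), and wear volume is taken as proportional to the accumulated energy. The sketch below is a simplified illustration; the energy wear coefficient alpha and all numeric values are assumed, not measured.

```python
# Dissipated-energy estimate of gross-slip fretting wear.
# Assumptions: rectangular hysteresis loop (energy per cycle = 4 * Q * d),
# wear volume proportional to accumulated dissipated energy via alpha.

def fretting_wear_volume(q_n: float, slip_m: float,
                         cycles: int, alpha_m3_per_j: float) -> float:
    """Wear volume [m^3] ~ alpha * N * (4 * Q * d) under gross-slip fretting."""
    energy_per_cycle = 4.0 * q_n * slip_m  # joules dissipated per cycle
    return alpha_m3_per_j * cycles * energy_per_cycle

# Example (assumed): Q = 50 N tangential force, 25 um slip amplitude,
# one million vibration cycles, alpha = 2e-14 m^3 per joule.
v = fretting_wear_volume(50.0, 25e-6, 1_000_000, 2e-14)
print(f"{v * 1e9:.3f} mm^3")
```

The monotonic growth of wear volume with Q in this model is consistent with Zhang et al.'s observation that wear depth increases with tangential force.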
5. Wear Morphology
The surfaces of bucket teeth exhibit diverse forms of wear, leading to distinct wear profiles for each component. In mining excavator bucket teeth, most surfaces exhibit high-stress wear, predominantly characterized by micro-cutting and plastic ploughing grooves; these are classified as abrasive wear and dominate the overall wear of the bucket teeth. The surface cracks caused by this wear lead to the accumulation of materials such as rock on the teeth, allowing Ca, O, K, Na, Si, and Al elements from sand and gravel to penetrate the bucket teeth. This process alters the original composition of the wear-resistant alloy, rendering it non-wear-resistant and producing a depletion or enrichment of surface alloy elements. The resulting difference in composition between the surface and substrate of the tooth weakens its anti-wear performance, accelerates the wear rate, and ultimately reduces tooth life.
Hu's study examined the macroscopic morphology of failed bucket teeth and found that contact with ore formed deep wear grooves and impact craters on the tooth surface. A thick deformation layer was also observed on the surface of the bucket teeth, with severe plastic deformation in areas where the metal was folded on the wear surface. The wear subsurface morphology showed a white bright layer in the deeper subsurface of the wear grooves, known as the adiabatic shear layer. Although hard and corrosion resistant, this thin, brittle layer is undesirable, as it is prone to cracking and accelerates machine damage. Its occurrence indicates poor shear resistance of the material, making it susceptible to plastic destabilization during abrasive cutting or deformation: deformation of the metal generates heat, which raises the temperature and softens the material, promoting further deformation and heating. Because this heat cannot be transferred to the surrounding material in time, subsequent rapid cooling forms a fine martensitic structure in the affected layer. Observations of the wear debris showed small cutting chips. The formation of adiabatic shear layers, and the heat generated during deformation, can significantly degrade the wear resistance of bucket teeth. Understanding the morphology of failed bucket teeth and the factors governing their wear resistance is therefore essential for improving the efficiency of construction machinery; further research on the material properties and design of bucket teeth is needed to reduce the occurrence of adiabatic shear layers, improve overall wear resistance, and ultimately raise efficiency and lower costs for the construction industry.
Valtonen et al. compared the wear of a mining loader bucket's cutting edge with laboratory samples, using various wear testing methods to simulate field conditions in the laboratory. They characterized the wear surfaces and cross-sections of the bucket's cutting edges and test specimens and found that work hardening occurred in all tested bucket wear steels, although the amount of plastic deformation and the depth of wear varied. Valtonen et al. also investigated the hardness of wear-resistant steel and the effect of different abrasives on its wear rate and wear mechanism under laboratory conditions. They found that as the hardness of the steel increased, deformation of the wear surface decreased, and that the scratches produced by abrasive wear were most noticeable in the softer steels. They also found that the type of abrasive had a more significant effect on the wear mechanism than the hardness of the steel, suggesting that abrasive type is a critical factor to consider when examining the wear mechanism of wear-resistant steel. According to Keles et al., bucket teeth develop a lath-martensite structure after heat treatment. Martensitic microstructure can take various forms, such as lath, butterfly, lenticular, and thin plate. Among these morphologies, the lath form is typically observed, as it is easily produced through a simple heat treatment process.