
Battery Life in Notebooks

PCQ Bureau

Many factors affect how long a portable computer's battery can deliver power before it must be recharged, including the types of applications being run and the display brightness. Here, we discuss the top battery-life benchmarks, how they exercise the system, and some of the techniques used to more accurately reflect the typical keystroke patterns (including pauses) of end users. We also discuss the main factors that impact battery life; next time, we will show how users can extend the battery life of their portable computers.


Battery-life benchmarks



Benchmarks exercise a system by running a fixed workload repeatedly until the battery forces the system to shut down. The time from the beginning of the test sequence until the system is shut down is the 'battery-life score' for that particular test run. Early battery life benchmarks executed their workloads as fast as the processor could accept the commands. Keyboard input was paused several times in the command sequence to simulate idle time when the user was performing other functions. A similar approach was used in all desktop and portable system-performance benchmarks during that time. An example of this approach was the Ziff Davis BatteryMark benchmark, which ran a sequence of tests that exercised different subsystems.

Direct Hit!
Applies to: Notebook owners
USP: How different kinds of applications affect a notebook's battery life
Primary Link: veritest.com/benchmarks/battmark
Google keywords: BatteryMark

There were short pauses between each test and a longer pause at the end of the sequence. This process was repeated continuously for as long as the battery lasted. The drawback of this approach was that it did not attempt to simulate the way users actually type when using an application. The benchmark did not run actual user applications, nor did it properly simulate the pauses associated with tasks such as reading e-mail. Two leading battery-life benchmarks that attempt to better simulate actual usage models are Winstone 2004 BatteryMark and MobileMark 2002.
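Before looking at these two in detail, the basic rundown mechanics can be pictured with a minimal sketch (in Python, purely illustrative and not taken from any of these benchmarks): a fixed workload is repeated, with a pause between iterations, and the latest elapsed time is written to disk so the score survives the shutdown that ends the test. The run_workload() function is a hypothetical stand-in for the application tasks a real benchmark would execute.

# Illustrative sketch of a battery-rundown loop; not code from BatteryMark
# or MobileMark. run_workload() is a hypothetical stand-in for the suite
# of application tasks a real benchmark would execute.
import time

def run_workload():
    # Placeholder for the fixed sequence of application tasks.
    pass

def rundown(pause_seconds=30, log_path="battery_score.log"):
    start = time.monotonic()
    while True:                      # loops until the battery forces a shutdown
        run_workload()               # exercise the system
        elapsed = time.monotonic() - start
        # Persist the latest elapsed time so the score survives the
        # abrupt shutdown that ends the test.
        with open(log_path, "w") as f:
            f.write(f"{elapsed / 3600:.2f} hours\n")
        time.sleep(pause_seconds)    # idle pause between iterations

if __name__ == "__main__":
    rundown()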


BatteryMark 



The latest version is BWS 2004 BatteryMark, but earlier versions (BWS 2002 BatteryMark and BatteryMark 4.0.1) are still common. VeriTest developed this benchmark for Ziff Davis Media, a major publisher of business and consumer magazines. The BWS BatteryMark workload is the same suite of applications that makes up the Business Winstone workload. These include:

Lotus Notes R5, Microsoft Office XP (FrontPage, PowerPoint, Excel, Access, and Word), Microsoft Project 2000, Netscape 6.2.1, Norton AntiVirus 2002, WinZip 8.0


Although it uses the same workload, BWS BatteryMark differs from Business Winstone by including pauses between tasks. It does not, however, simulate a user's typing speed or the pauses within it. BatteryMark reports an overall score that corresponds to the time elapsed from the beginning of the test until the system shuts down due to a low-battery condition.

The table of results also reports 'wait time' (the time the system spent sleeping or waiting between tasks) and 'active time' (the time the system spent actually doing work).

BAPCo (Business Applications Performance Corporation) developed another benchmark called MobileMark 2002. BAPCo is a non-profit consortium that develops and distributes objective performance benchmarks based on popular computer applications and industry-standard OSs. Current members of BAPCo include Dell, Hewlett-Packard, IBM, Intel, Microsoft, and AMD. MobileMark simulates actual usage models by including variable-length pauses between keystrokes and certain other tasks. This approach more closely matches the typing speed and workflow of real users, which matters because notebooks include power-management features that take advantage of these pauses to reduce system power and prolong battery life. The inclusion of more realistic keystroke pauses allows the benchmark to better reflect the battery life a user will experience when performing similar workloads.
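The difference in pacing can be pictured with a short sketch, again illustrative Python rather than MobileMark's actual pacing model: each simulated keystroke is followed by a variable-length pause, and the loop keeps separate totals for active time and wait time. The send_keystroke() helper and the delay range are assumptions for illustration only.

# Illustrative sketch of paced "typing" with variable keystroke delays;
# not MobileMark's actual pacing model. send_keystroke() is hypothetical.
import random
import time

def send_keystroke(char):
    # Placeholder for injecting a keystroke into the application under test.
    pass

def type_text(text, min_delay=0.08, max_delay=0.40):
    active_time = 0.0
    wait_time = 0.0
    for char in text:
        t0 = time.monotonic()
        send_keystroke(char)                  # "active" work
        active_time += time.monotonic() - t0
        pause = random.uniform(min_delay, max_delay)
        time.sleep(pause)                     # idle gap power management can exploit
        wait_time += pause
    return active_time, wait_time

active, wait = type_text("The quick brown fox jumps over the lazy dog.")
print(f"active: {active:.2f}s  wait: {wait:.2f}s")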

MobileMark



The MobileMark benchmark is unique among industry battery-life benchmarks. Traditionally, battery-life and performance benchmarks are run separately: the battery-life benchmark is run while the portable system (notebook) is on battery power, and the performance benchmark is run while the system is plugged into a power outlet.


This approach can yield incomplete or misleading information because most notebooks are configured to transition to a reduced performance level when running on battery power in order to extend battery life. It is possible to manipulate a traditional battery-life benchmark into yielding a high battery-life score by artificially reducing performance to a level that renders the system unsuitable for real-world applications. In contrast, the MobileMark 2002 benchmark measures battery life and system performance simultaneously.

MobileMark 2002 can be run with either a 'productivity' or a 'reader' workload. The productivity workload mimics a mobile professional accessing e-mail and creating documents, graphics, and animation; it also mimics background file compression and virus-detection operations. The following applications are used during the MobileMark productivity test: Microsoft Word 2002, Microsoft Excel 2002, Microsoft PowerPoint 2002, Microsoft Outlook 2002, Netscape Communicator 6.0, WinZip Computing WinZip 8.0, McAfee VirusScan 5.13, Adobe Photoshop 6.0.1, and Macromedia Flash 5. The reader workload emulates a person reading a book or other document using Netscape Communicator 6.0.

Published battery-life numbers use the productivity workload because it more closely simulates the real-world usage of a typical business portable computer user.

JEITA, another battery-life benchmark common in Japan, is cited in many battery specifications. JEITA (the Japan Electronics and Information Technology Industries Association) does not attempt to simulate typical office productivity workloads.


Instead, it consists of running an MPEG video file continuously until the notebook shuts down due to a low battery. This test tends to produce unusually long battery-life scores, but it does not simulate typical office productivity workloads.



Workload effects on the benchmarks

The entire test-suite design of the BatteryMark and MobileMark benchmarks is based on a defined workload. Here we explain why so much effort is put into defining the workload used in a battery-life benchmark. The combination of workload and keystroke pauses determines how hard the system electronics (CPUs, graphics controllers, and so forth) must work during the test. The harder the system works, the more power it consumes and the shorter the resulting battery life.

Applications like word processing and mail usually have low performance requirements. In contrast, complex scientific analysis programs or 3D applications and games can have high CPU and graphics performance requirements. Graph 1 shows how power consumption can vary for a system and some of its major components while running different applications. Each stack represents the power consumed while running a particular workload or while the system is in 'Windows Idle' mode. The height of each stack represents the total system power drawn from the battery. Each stack is broken down into various system components: graphics controller, hard drive, processor, and the remaining components such as memory and the core-logic chipset (labeled 'rest of system').


For example, the processor consumes approximately 3 watts running MobileMark 2002, an average of 5-6 watts during DVD playback, and about 11 watts during 3DMark 2000, a benchmark that measures 3D graphics performance.

Graph 2 shows how these power-consumption variations affect battery life. It shows the battery life of a system with a 53-watt-hour (Whr) battery running the workloads shown in Graph 1. Note that the battery life varies significantly, from approximately 1.5 hours under the 3D application workload of 3DMark 2000 to 3.8 hours in the Windows OS idle state.

Graph 2: Battery life under different workloads. These benchmarks were run on a Dell Latitude C600 notebook 
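The arithmetic behind these numbers is simple: estimated battery life in hours equals battery capacity in watt-hours divided by the average system power draw in watts. The sketch below applies this to the 53 Whr battery mentioned above; the average power values are back-calculated from the published 3.8-hour and 1.5-hour results and are only illustrative.

# Battery life (hours) = capacity (Whr) / average system power (W).
# Average power values below are illustrative, back-calculated from the
# roughly 3.8 h idle and 1.5 h 3DMark 2000 results quoted in the text.
CAPACITY_WHR = 53.0

workloads = {
    "Windows idle": 14.0,   # watts (assumed average)
    "3DMark 2000":  35.0,   # watts (assumed average)
}

for name, avg_watts in workloads.items():
    hours = CAPACITY_WHR / avg_watts
    print(f"{name}: {hours:.1f} hours")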


The heavy workload of 3DMark 2000 consumes more power, which shortens the battery life. The fact that battery life varies so significantly, depending on the particular mix of applications running on a portable system, is one of the main reasons that the actual battery life seen by end users can differ from the estimates published in magazines or on company websites.



After this coverage of a few of the leading notebook battery-life benchmarks, the task still remains to understand how one can extend battery life in real-world use.

In the next part of this article we will take you through some of the measures to do that.

Gary Verdun, Technology Strategist, Office of the Chief Technology Officer, Dell
