memory performance

Results 1 - 25 of 47
Published By: Amazon Web Services     Published Date: Nov 14, 2018
Amazon Redshift Spectrum, a single service that can be used in conjunction with other Amazon services and products as well as external tools, is changing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum lets users query very large datasets on S3 without loading them into Amazon Redshift. This helps address the scalability dilemma: with Spectrum, data storage can keep growing on S3 and still be processed. By using its own compute power and memory, Spectrum handles work that would otherwise fall to the Amazon Redshift cluster, so users can scale to larger volumes of data than the cluster could process with its own resources. This e-book provides expert tips on using Amazon Redshift Spectrum to increase performance and potentially reduce the cost of your queries.
Tags : 
    
Amazon Web Services
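The Spectrum pattern described above boils down to registering S3 data as an external table so queries run against it in place. The sketch below composes the kind of DDL and query such a setup typically uses; the schema, table, column, and bucket names are illustrative assumptions, not taken from the e-book.

```python
# Sketch: composing the SQL a Redshift Spectrum setup typically runs.
# Schema, table, columns, and S3 path below are illustrative assumptions.

def external_table_ddl(schema: str, table: str, s3_path: str) -> str:
    """Build a CREATE EXTERNAL TABLE statement for data left on S3."""
    return (
        f"CREATE EXTERNAL TABLE {schema}.{table} (\n"
        "    event_id BIGINT,\n"
        "    event_time TIMESTAMP,\n"
        "    payload VARCHAR(256)\n"
        ")\n"
        "STORED AS PARQUET\n"
        f"LOCATION '{s3_path}';"
    )

ddl = external_table_ddl("spectrum", "events", "s3://my-data-lake/events/")
query = "SELECT COUNT(*) FROM spectrum.events WHERE event_time > '2018-01-01';"
print(ddl)
print(query)
```

Once the external table exists, the cluster never ingests the S3 data; Spectrum's own compute scans it at query time, which is what keeps the cluster sized for hot data only.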
Published By: Amazon Web Services     Published Date: Oct 09, 2017
Enterprise customers can take advantage of the many benefits provided by Amazon Web Services (AWS) to achieve business agility, cost savings, and high availability by running their SAP environments on the AWS Cloud. Many enterprise customers run SAP production workloads on AWS today, including those that run on non-SAP databases (Oracle, Microsoft SQL Server, DB2) or on SAP databases (SAP HANA, SAP ASE). To support demand for high-memory instances, AWS has disclosed its SAP HANA instance roadmap (8 TB and 16 TB in 2018) and just made 4 TB x1e instances available. A few examples of how AWS has helped SAP customers cut costs and improve performance and agility: BP cut its SAP infrastructure cost by one-third, Zappos successfully migrated to SAP HANA on AWS in less than 48 hours, and a major healthcare and life science company now runs BW on HANA with 30% better performance than on premises. This guide is intended for SAP customers and partners who want to learn about the benefits and options for running SAP solutions on AWS, or who want to know how to implement and operate their SAP environment effectively on AWS.
Tags : 
storage, networking, management, aws, sap, infrastructure, lower tco, capex, opex
    
Amazon Web Services
Published By: CCSS     Published Date: Jul 05, 2007
Capability breeds dependence. The more you can do, the more is asked of you. The more time you save, the more time you spend seeking out ways to save more time or money. If less really is more, how do you do more with less? Effective system monitoring could help you break free of the system shackles and use your System i (iSeries) network to save time and money.
Tags : 
ibm, system monitoring, monitoring, network monitoring, productivity, performance management, system performance, memory pool performance, memory pool, lpar, system configuration, performance monitoring, ccss, network management
    
CCSS
Published By: Dell     Published Date: Aug 16, 2013
Virtualization introduces new challenges for managing your data center. Applications now compete for shared resources such as storage, CPU cycles, and memory, often causing performance bottlenecks and unhappy customers. This paper details 20 metrics that indicate when capacity issues are occurring and recommends a corresponding tool that will analyze the data and resolve your VM performance issues. Read the white paper.
Tags : 
vmware, metrics, virtualization, bandwidth management, infrastructure
    
Dell
Published By: Dell     Published Date: May 13, 2016
Your growing business shouldn't run on aging hardware and software until it fails. Adding memory and upgrading processors will not provide the same benefits to your infrastructure as a consolidation and upgrade can. Upgrading and consolidating your IT infrastructure to the Dell PowerEdge VRTX running Microsoft Windows Server 2012 R2 and SQL Server 2014 can improve performance while adding features such as high availability.
Tags : 
    
Dell
Published By: Dell     Published Date: May 13, 2016
No matter your line of business, technology implemented four years ago is likely near its end of life and may be underperforming as more users and more strenuous workloads stretch your resources thin. Adding memory and upgrading processors won't provide the same benefits to your infrastructure as a consolidation and upgrade can. Read this research report to learn how upgrading to Dell's PowerEdge VRTX with Hyper-V virtualization, Microsoft Windows Server 2012 R2, and Microsoft SQL Server 2014 could reduce costs while delivering better performance than trying to maintain aging hardware and software.
Tags : 
    
Dell
Published By: Dell     Published Date: Jul 08, 2016
Dell Virtual SAN Ready Nodes with Horizon abstract and aggregate compute and memory resources into logical pools of compute capacity, while Virtual SAN pools server-attached storage to create a high-performance, shared datastore for virtual machines.
Tags : 
technology, best practices, poweredge, hyper converged, networking, mobile computing
    
Dell
Published By: Dell EMC     Published Date: May 11, 2016
Your growing business shouldn't run on aging hardware and software until it fails. Adding memory and upgrading processors will not provide the same benefits to your infrastructure as a consolidation and upgrade can. Upgrading and consolidating your IT infrastructure to the Dell PowerEdge VRTX running Microsoft Windows Server 2012 R2 and SQL Server 2014 can improve performance while adding features such as high availability.
Tags : 
    
Dell EMC
Published By: Dell EMC     Published Date: May 11, 2016
No matter your line of business, technology implemented four years ago is likely near its end of life and may be underperforming as more users and more strenuous workloads stretch your resources thin. Adding memory and upgrading processors won't provide the same benefits to your infrastructure as a consolidation and upgrade can. Read this research report to learn how upgrading to Dell's PowerEdge VRTX with Hyper-V virtualization, Microsoft Windows Server 2012 R2, and Microsoft SQL Server 2014 could reduce costs while delivering better performance than trying to maintain aging hardware and software.
Tags : 
    
Dell EMC
Published By: Dell Software     Published Date: Aug 15, 2013
Virtualization means applications now compete for resources such as storage, CPU cycles, and memory, often leading to bottlenecks and unhappy customers. Find out how you can improve your VM performance. Read the white paper.
Tags : 
dell, virtualization, applications, storage, cpu, memory, data center, metrics
    
Dell Software
Published By: Diskeeper Corporation     Published Date: Oct 30, 2008
Apacer, a global leader in memory modules, announced the introduction of SSD+ Optimizer, the world’s first Solid State Drive (SSD) optimization solution with HyperFast technology from performance and reliability innovator Diskeeper Corporation®.
Tags : 
diskeeper corporation, windows, apacer, nand, hyperfast, storage
    
Diskeeper Corporation
Published By: Diskeeper Corporation     Published Date: Oct 30, 2008
Virtualization does have its dangers, as it incurs greater stress on physical resources. While underutilization of CPU may be a driving factor in virtualizing servers, other hardware resources may become overtaxed. Given that a host system has limited ability (depending on the application) to page memory used by the guest systems, the most recognized bottleneck to address is physical memory (RAM). Options to programmatically alleviate memory bottlenecks incur performance issues when the disk is re-introduced. Another major, and perhaps less acknowledged, component is the disk subsystem. In many cases, depending on the purpose and application of the guest/virtual systems, the disk bottleneck will be the most significant barrier to performance.
Tags : 
diskeeper corporation, virtualize servers, virtualizations, divergence, convergence, storage
    
Diskeeper Corporation
Published By: HP     Published Date: Nov 05, 2014
Organizations no longer have to wait months or years to deploy an all-flash storage array to host their applications. The technologies in this most recent iteration of the HP 3PAR StoreServ 7450 ensure that organizations get the performance they need, at the cost they need, and the platform stability to serve as many applications as they see fit. By taking advantage of the HP 3PAR StoreServ 7450 platform, organizations may confidently begin their journey into the all-flash world of tomorrow with the knowledge that it will meet their manageability, performance, and scalability requirements along the way. Download this whitepaper now.
Tags : 
flash memory, memory storage, buying decision, all-flash storage, data management, storage management, data reduction, performance production, proven data, software service, storeserv
    
HP
Published By: HPE Intel     Published Date: Jan 11, 2016
The world of storage is being transformed by the maturing of flash arrays, an approach to storage that uses multiple solid-state flash memory drives instead of spinning hard disk drives. An all-flash array performs the same functions as traditional spinning disks but in a fraction of the time and in more compact form factors. Given their superior performance in certain contexts, all-flash arrays are experiencing strong industry adoption. However, best practices and a true understanding of the key success factors for all-flash storage are still emerging. This paper is intended to educate you on best practices based on real user experience drawn from ITCentralStation.com. We offer advice from all-flash users on selecting and building the business case for a flash array storage solution.
Tags : 
    
HPE Intel
Published By: IBM     Published Date: Feb 02, 2009
Learn how IBM’s change data capture technology can be used in conjunction with IBM’s performance management solutions from Cognos to provide access to the trusted information that systems and employees need to make informed decisions at the speed of business.
Tags : 
ibm, information management software, performance management, business intelligence, ibm cognos, bi-ready data, departmental reporting with ibm cognos, in-memory data store, operational data stores, enterprise data warehouse, ibm® infosphere change data capture, cdc, kpi, cognos now!, operational data store, ods, ibm infosphere datastage
    
IBM
Published By: IBM     Published Date: Mar 13, 2014
An infrastructure needs three factors to optimize cloud performance; this eBook shows how IBM System x uses all three to increase business agility and reduce costs.
Tags : 
ibm, server design, memory, processor, software, processing power, data, cloud performance, business agility, cloud computing
    
IBM
Published By: IBM     Published Date: Oct 06, 2014
Born from new advances in data processing from IBM Research, IBM® DB2® with BLU Acceleration is a leap forward in database technology that raises the bar for performance and value. BLU Acceleration uses patented technologies to deliver a unique combination of performance, ease of use, and cost-efficiency, with 8 to 25 times faster reporting and analytics and cases of more than 1,000 times faster answers to queries. BLU Acceleration also complements in-memory Dynamic Cubes in IBM Cognos® Business Intelligence with 24 times faster query performance.
Tags : 
data processing, blu accerleration, database technology, data analytics
    
IBM
Published By: IBM Corp     Published Date: Jun 20, 2011
Change is always a little scary, but depending on how old your technology is, new gear may offer tremendous performance benefits to your organization. Read more here.
Tags : 
ibm, express seller, server refresh, memory capacity, network performance, migration to vm, consolidation of servers, flexibility, reduced downtime, maintenanc, disaster recovery, solutions, performance, mtbf, cost of capital, reduce expenses, server hardware, servers
    
IBM Corp
Published By: Intel Corporation     Published Date: Aug 25, 2014
Sponsored by: NEC and Intel®
Servers with the Intel® Xeon® processor E7 v2 family in a four-CPU configuration can deliver up to twice the processing performance, three times the memory capacity, and four times the I/O bandwidth of previous models. Together with their excellent transaction processing performance, these servers provide the high level of availability essential to enterprise systems via advanced RAS functions that guarantee the integrity of important data while also reducing costs and the frequency of server downtime. Intel, the Intel logo, Xeon, and Xeon Inside are trademarks or registered trademarks of Intel Corporation in the U.S. and/or other countries.
Tags : 
enterprise systems, platform, datacenter, servers, virtualization, customer value, analytics, application owners, system integrators, big data, reliability, enterprise, availability, serviceability, processor, performance testing, server virtualization
    
Intel Corporation
Published By: Internap     Published Date: Dec 02, 2014
NoSQL databases are now commonly used to provide a scalable system to store, retrieve, and analyze large amounts of data. Most NoSQL databases are designed to automatically partition data and workloads across multiple servers, enabling easier, more cost-effective expansion of data stores than the single-server, scale-up approach of traditional relational databases. Public cloud infrastructure should provide an effective host platform for NoSQL databases given its horizontal scalability, on-demand capacity, configuration flexibility, and metered billing; however, the performance of virtualized public cloud services can suffer relative to bare-metal offerings in I/O-intensive use cases. Benchmark tests comparing the latency and throughput of a high-performance, in-memory (flash-optimized) key-value NoSQL database on popular virtualized public cloud services and on an automated bare-metal platform show the performance advantages of bare metal over virtualized public cloud.
Tags : 
internap, performance analysis, benchmarking, nosql, bare-metal, public cloud, infrastructure, on demand capacity, data center
    
Internap
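The benchmark described above rests on measuring per-operation latency and reporting tail percentiles. The sketch below shows that measurement pattern in miniature, with a plain Python dict standing in for the key-value store; the operation count, value size, and percentile choice are illustrative assumptions.

```python
# Sketch: per-operation latency measurement with tail-percentile reporting,
# as used in key-value store benchmarks. A dict stands in for the store.
import time
import statistics

store = {}

def timed_op(fn) -> float:
    """Return the wall-clock latency of one operation in microseconds."""
    start = time.perf_counter()
    fn()
    return (time.perf_counter() - start) * 1e6

# 10,000 writes of a 64-byte value, then 10,000 reads of the same keys.
write_latencies = [timed_op(lambda i=i: store.__setitem__(f"k{i}", b"v" * 64))
                   for i in range(10_000)]
read_latencies = [timed_op(lambda i=i: store.get(f"k{i}"))
                  for i in range(10_000)]

def p99(samples):
    """99th-percentile latency: the tail metric benchmarks usually report."""
    return statistics.quantiles(samples, n=100)[98]

print(f"write p99: {p99(write_latencies):.1f} us, "
      f"read p99: {p99(read_latencies):.1f} us")
```

Tail percentiles, rather than averages, are what expose the I/O jitter that virtualized public cloud can add over bare metal, which is why benchmarks of this kind report them.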
Published By: Kingston     Published Date: Feb 15, 2011
Learn how to balance the positive and negative effects of memory utilization in virtual infrastructures to better handle system workload and priority, while improving server utilization.
Tags : 
virtualization, memory overcommitment, memory performance, memory reclamation, vmware, memory utilization, consolidation ratios, server memory, server utilization, memory ballooning, vmkernal swapping, capacity planning, vmware esx 4.0, active directory, bandwidth management, convergence, distributed computing, ethernet networking, fibre channel, gigabit networking
    
Kingston
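The balancing act described above starts from one number: how much memory the guests have been granted relative to the host's physical RAM. The sketch below computes that overcommit ratio; the VM sizes and host capacity are illustrative assumptions.

```python
# Sketch: quantifying memory overcommitment on a virtualization host.
# VM sizes and host capacity below are illustrative assumptions.

def overcommit_ratio(vm_memory_gb, host_memory_gb):
    """Ratio of memory granted to guests versus physical RAM on the host.

    Above 1.0, the host relies on reclamation techniques (ballooning,
    hypervisor swapping) to cover the shortfall when guests get busy.
    """
    return sum(vm_memory_gb) / host_memory_gb

vms = [8, 8, 16, 4, 4]   # configured guest memory, GB
host_ram = 32            # physical RAM on the host, GB

ratio = overcommit_ratio(vms, host_ram)
print(f"overcommit ratio: {ratio:.2f}x")  # 40 GB granted / 32 GB physical
```

A ratio modestly above 1.0 raises consolidation, but the further it climbs, the more often reclamation kicks in and the larger its performance cost, which is the trade-off the paper examines.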
Published By: Mentor Graphics     Published Date: Apr 03, 2009
A powerful signal integrity analysis tool must be flexible, easy to use, and integrated into an existing EDA framework and design flow. In addition, the tool must be sufficiently accurate. This report reviews a validation study of the Mentor Graphics HyperLynx 8.0 PI tool to establish confidence in using it for power integrity analysis.
Tags : 
mentor graphics, pdn simulation, eda framework, mentor hyperlynx 8.0 pi, integrity analysis, virtual prototypes, esr, capacitor, power distribution network, vrm, voltage regulator module, signal, smas, analog models, backward crosstalk, capacitive crosstalk, controlling crosstalk, correct emc problems, correct emi problems, cross talk
    
Mentor Graphics
Published By: Mentor Graphics     Published Date: Apr 03, 2009
For advanced signaling over high-loss channels, designs today use equalization and several new measurement methods to evaluate the performance of the link. Both simulation and measurement tools support equalization and the new measurement methods, but correlation of results throughout the design flow is unclear. In this paper, a high-performance equalized serial data link is measured and its performance is compared to that predicted by simulation. The differences between simulation and measurement are then discussed, as well as methods to correlate the two.
Tags : 
mentor graphics, equalized serial data links, design flow, high loss channels, tektronix, pcb, bit error rate, ber, ieee, serdes, simulation, system configuration, mentor graphics hyperlynx, simplified symmetric trapezoidal input, duty cycle distortion, ber contours, electronics, analog models, backward crosstalk, capacitive crosstalk
    
Mentor Graphics
Published By: NetApp     Published Date: Aug 26, 2016
How has flash evolved over the years? This infographic takes you from the invention of flash memory in 1980 to its introduction in storage systems in 2008 to where we are today: an all-flash platform that supplies flexible, independent scaling of both capacity and performance.
Tags : 
netapp, database performance, flash storage, data management, cost challenges
    
NetApp