A community of more than 1,600,000 database professionals and growing
Defending the RDBMS

A few weeks ago I ran across an essay from Randolph West called Relational Databases Aren't the Problem. This was a response to another essay that made a case for relational databases being bad for many businesses. I thought both pieces were interesting for different reasons. I don't believe the RDBMS is perfect, and it certainly can be hard for developers to build software that interfaces with a relational system.

The original complaint about the RDBMS is somewhat rambling and deceitful, in my opinion. It is an excellent study in how to use a few concepts to confuse a casual reader and create doubt. If I weren't reading closely, I might fall for a number of the claims about problems with relational databases. In my mind, though, quite a few of the issues discussed aren't problems with relational databases at all, but problems with poorly developed software or poorly designed entities and relationships. I'm even more disappointed that the author hasn't really addressed any comments, but has just pasted a link to his follow-up article.

I do think the defense from Mr. West does a good job, though it also misses some of the primary issues we struggle with in relational databases. There are gaps in the knowledge of how to build a well-performing database, both among application developers who view this as a necessary evil and among experienced database developers who don't regularly improve their skills or try new design techniques. Both pieces also fail to address the issues of gathering and working with multiple rows of data.

The second discussion, of "doing without databases," really implements its own database management structure. That may work well, but it is fraught with problems, such as the concurrency of multiple users searching and scanning through data without indexes. While indexes are overhead, they are necessary, as hash buckets aren't feasible for every property in a class. And if you end up building hash buckets for multiple properties, you're building an index. There's another good defense of some of the issues here.

Keeping more data in memory and synchronizing access to structures sounds great, but scaling that out to multiple systems, ensuring consistency at high volumes, and guarding against data loss from crashes are all problems. The write-ahead log in SQL Server does a wonderful job of ensuring we can handle redo and undo on system restart. The method presented doesn't necessarily ensure this, though perhaps accepting some data loss from high-concurrency changes is OK for many applications.

I will say that the idea of keeping all data in memory is interesting. I had to stop and think about how many databases really have more than 1TB of data. If we throw out indexes, does that cover most data stores? I bet it does, though that doesn't mean there aren't issues with using in-memory array structures for widely varying data sizes.

Would I use an in-memory data structure for software? It's tempting, but honestly, I wouldn't. The value of data is too high, and the potential for problems from a poorly implemented ACID control structure is too great. Plenty of issues have been found in different RDBMSs over the years, and even some in NoSQL systems. Thinking that I could avoid all of those issues and still protect data on my own is something I wouldn't even try.
After all, if there is some error, I'd prefer it come from a system that many people use rather than from one I built trying to emulate an RDBMS for no good reason. (A short sketch below shows how SQL Server itself offers the in-memory-with-logging trade-off mentioned above.)

Steve Jones from SQLServerCentral.com
Join the debate, and respond to today's editorial on the forums
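Here is a minimal T-SQL sketch, not part of the editorial itself, of the trade-off discussed above: SQL Server's In-Memory OLTP feature keeps a table entirely in memory while the engine still writes durable changes to the transaction log. It assumes SQL Server 2016 or later and a database that already has a MEMORY_OPTIMIZED_DATA filegroup; the table and column names are invented for illustration.

-- Memory-optimized table: rows live in memory, durability still comes from the log.
CREATE TABLE dbo.SessionState
(
    SessionId   INT            NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    Payload     NVARCHAR(4000) NULL,
    LastTouched DATETIME2      NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON,
      DURABILITY = SCHEMA_AND_DATA); -- committed changes are still logged for redo/undo

-- DURABILITY = SCHEMA_ONLY would skip logging the rows entirely: faster, but the data
-- is gone after a restart, which is the "accepting some data loss" compromise above.

In other words, the engine already offers the choice the two essays argue about, without hand-rolling your own synchronization and recovery code.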
The Voice of the DBA Podcast
Listen to the MP3 Audio (6.2MB) podcast or subscribe to the feed at iTunes and Libsyn. The Voice of the DBA podcast features music by Everyday Jones. No relation, but I stumbled onto them and really like the music.
ADVERTISEMENT
Benchmark your Database DevOps maturity level
Get a better understanding of how advanced your current processes are, receive recommendations for improvements, and see how your maturity level compares with that of your peers. Complete the Database DevOps Maturity Assessment
What’s the top challenge faced by SQL Server professionals in 2018?
Learn how 626 SQL Server professionals monitor their estates in our new report on the State of SQL Server Monitoring. Discover the challenges currently facing the industry, and what is coming next. Download your free copy of the report
Additional Articles from SimpleTalk
As running SQL Server on Linux becomes more common, DBAs must learn and become comfortable with the Linux OS. In this article, Kellyn Pot'Vin-Gorman demonstrates how to create a SQL Server instance running in Linux via a Docker container. More »
You’ll probably have heard of the Accelerate State of DevOps Report from DORA. Now in its fifth year and backed by rigorous research involving 30,000+ professionals worldwide, it has consistently shown that higher software delivery performance delivers powerful business outcomes. More »
Bert Wagner from SQLServerCentral Blogs
When beginning to learn SQL, at some point you learn that indexes can be created to help improve the performance... More » (a minimal index example follows these blog links)
Kenneth Fisher from SQLServerCentral Blogs
tl;dr; While the difference is very important 90% of the time you won’t care and should just add the two... More »
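As promised above, a minimal, hypothetical example of the kind of index Bert Wagner's post is talking about; the table, column, and index names here are invented, and the blog post walks through its own examples.

-- Without an index on LastName, this search has to scan the whole table.
CREATE TABLE dbo.Customers
(
    CustomerId INT IDENTITY(1,1) PRIMARY KEY,
    FirstName  NVARCHAR(50) NOT NULL,
    LastName   NVARCHAR(50) NOT NULL
);

-- A nonclustered index on LastName lets SQL Server seek straight to the matching rows.
CREATE NONCLUSTERED INDEX IX_Customers_LastName
    ON dbo.Customers (LastName);

SELECT CustomerId, FirstName, LastName
FROM dbo.Customers
WHERE LastName = N'Jones';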
Today's Question (by Steve Jones): I want to use a table-valued parameter (TVP) in a procedure. How do I declare the structure of the TVP?
Think you know the answer? Click here, and find out if you are right. We keep track of your score to give you bragging rights against your peers. This question is worth 1 point in this category: Table-Valued Parameter. We'd love to give you credit for your own question and answer. To submit a QOTD, simply log in to the Contribution Center.
ADVERTISEMENT
Design and configure SQL Server instances and databases in support of mission-critical, high-throughput applications that provide consistent response times in the face of variations in user numbers and query volumes. Learn to configure SQL Server and design your databases to support a given instance and workload. Pick up your copy of this great book at Amazon today.
Yesterday's Question of the Day
Yesterday's Question (by Steve Jones): I have created a few vectors in R with this code:

> west.teams = c('Diamondbacks', 'Rockies', 'Dodgers', 'Giants', 'Padres')
> west.wins = c(72,71,70,65,50)
> west.losses = c(58,59,61,67,83)

I want to combine these vectors into a data frame. How do I do this?

Answer: NL.west = data.frame(west.teams, west.wins, west.losses)

Explanation: The data.frame() function can be used to combine vectors into a data frame. Ref: Data Frame - click here »

Discuss this question and answer on the forums
Database Pros Who Need Your Help
Here are a few of the new posts today on the forums. To see more, visit the forums.

Database size increased after dacpac applied - Hi, Database size is increased after applying dacpac database size - 285 GB (before dacpac applied) database size - 532 GB (after dacpac applied) Scan documents...
Cost Threshold For Parallelism - Your opinion - Hi, I'm not going to ask what its for or anything, I understand it and its whole history back to Nick's...
update to SQL 2016 with log shipping - Howdy all.. This is more of a general question... we currently are running our production applications on an instance of SQL...
Fast Database Clone - Hi, What is the fastest way to clone a database of 50 GB every day (SQL2014), without any interruption (detach / attache)....
SQL 2016 SSRS Native Mode - Report Loading Time - We are running SQL 2016 SSRS in native mode using the web portal. If a user is part of the...
Hide schema from all users (except people with SA access) - Hi all I need to create a schema to store a set of tables but I don't want anyone (including those...
Updating in duplicate records -
Dynamic creating indexes - We have more than 100 indexes exist in our DB. Sometimes due to some processes a part or all the...
Importing from XML to SQL 2014 - This is my 4th day on SQL so I am a newbie. I created a query to import from an...
Best Practice Help - Table Structures - Hi there, I am being asked to provide a list of all the reasons that we should create actual normalized...
SQL Server has encountered 1 occurrence(s) of I/O requests taking longer than 15 seconds - Hi there - Last night our application experienced a brief outage due to the following error which I found in the...
Does a check constraint with UDF has performance impact on Query? - Hi all We have a Master table that has a primary key with two columns ( Code char(4), location int ). We have...
separate pipe values - Is there a SQL to separate pipe values into separate rows? This is how the values are stored in my... (a short sketch follows this list)
Select Max dates which lower than current date - Hello I have the below table which has the records of the payments, I want to select the kids whom last...
Last Security Patch - Dear Experts, Can anybody help me know how to check what the last security patch was applied to our SQL Server...
Delete Old Tables - dear Friends, Kindly help me, I want to make a syntax that compares if the number of rows in table A...
How do I write a record for each day in a date range??? - Hi To make it simple I have a @StartDate and @EndDate I choose students with an admit date in that range from the...
Rounding decimal times into seconds. - Hi I am using the below code to convert decimal times into hh:mm:ss e.g. 0.18 minutes = 10.8 seconds rounder up to 11...
Restore backup from SQL 2017 to SQL 2008 - Hi. There is a lot of post about my question, but I have little different problem. We have an SQL...
SQL Azure Query Editor Transaction Blog - I could not find any logical explanation about the following issue. I connected SQL Azure Query Editor and I executed the...
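For the "separate pipe values" thread above, here is a minimal sketch of one common approach, assuming SQL Server 2016 or later with a database at compatibility level 130 or higher; the variable name and sample data are invented, and the original thread may have additional requirements.

-- STRING_SPLIT turns a delimited string into one row per value.
DECLARE @stored NVARCHAR(200) = N'alpha|bravo|charlie';

SELECT value AS piece
FROM STRING_SPLIT(@stored, N'|');
-- Returns three rows: alpha, bravo, charlie.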
This email has been sent to newsletter@newslettercollector.com. To be removed from this list, please click here. If you have any problems leaving the list, please contact webmaster@sqlservercentral.com.

This newsletter was sent to you because you signed up at SQLServerCentral.com. Feel free to forward this to any colleagues that you think might be interested. If you have received this email from a colleague, you can register to receive it here.

This transmission is ©2018 Redgate Software Ltd, Newnham House, Cambridge Business Park, Cambridge, CB4 0WZ, United Kingdom. All rights reserved. Contact: webmaster@sqlservercentral.com