Use the SHOW TABLES Command to Check if a Table Exists in MySQL

The second way, and a pretty easy one, is to use SHOW TABLES. Let's create a table sampletable in the database and check if it exists.

```sql
CREATE TABLE sampletable (myId INT);
SHOW TABLES LIKE 'sampletable';
```

Output:

```text
Tables_in_db3xs4qrcrf (sampletable)
sampletable
```

My solution involves the use of dependent subqueries. First, set up the sample schema:

```sql
CREATE TABLE users (user_id INT PRIMARY KEY AUTO_INCREMENT, name VARCHAR(20));
INSERT INTO users (name) VALUES ('Matt');
INSERT INTO users (name) VALUES ('Simon');
INSERT INTO users (name) VALUES ('Jen');
CREATE TABLE posts (post_id INT PRIMARY KEY AUTO_INCREMENT, user_id INT);
CREATE TABLE pages (page_id INT PRIMARY KEY AUTO_INCREMENT, user_id INT);
```

The per-user counts come from two correlated subqueries in the select list:

```sql
(SELECT COUNT(*) FROM posts WHERE posts.user_id = users.user_id) AS post_count,
(SELECT COUNT(*) FROM pages WHERE pages.user_id = users.user_id) AS page_count
```

To test performance differences, I loaded the tables with 16,000 posts and nearly 25,000 pages. Limited testing showed nearly identical performance between this query and your query using LEFT JOINs to SELECT subqueries. Your updated, simpler method took over 2,000 times as long (nearly 3 minutes, compared to a fraction of a second). Using EXPLAIN with each of the queries shows that both of your approaches involve a filesort, which my query avoids. Adding a key on user_id to the posts and pages tables avoids the filesort and sped the slow query up to only 18 seconds; that is still significantly slower than the other two queries.

I ran across this while trying to perform a similar task with a query containing about a dozen columns. More columns also required additions to the GROUP BY portion of the query. I do believe my approach is a bit easier to follow.
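SHOW TABLES is MySQL-specific, so as a rough analogue here is a hedged, self-contained sketch using Python's sqlite3 module, where SQLite's sqlite_master catalog plays the role that SHOW TABLES plays in MySQL. The `table_exists` helper is an invention for illustration, not part of the article.

```python
import sqlite3

# In-memory SQLite database standing in for the MySQL database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sampletable (myId INT)")

def table_exists(conn, name):
    # SQLite has no SHOW TABLES; its catalog table sqlite_master lists
    # every object, so filtering it by type and name is the equivalent
    # of SHOW TABLES LIKE 'name' in MySQL.
    row = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' AND name = ?",
        (name,),
    ).fetchone()
    return row is not None

print(table_exists(conn, "sampletable"))   # True
print(table_exists(conn, "missingtable"))  # False
```

In MySQL itself, querying information_schema.TABLES is the comparable programmatic alternative to parsing SHOW TABLES output.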
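To see the dependent-subquery approach end to end, here is a minimal runnable sketch using Python's sqlite3 module. SQLite, not MySQL, so AUTO_INCREMENT becomes INTEGER PRIMARY KEY; the sample rows are invented for illustration, and only the shape of the query follows the article.

```python
import sqlite3

# In-memory SQLite stand-in for the article's MySQL schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE users (user_id INTEGER PRIMARY KEY, name VARCHAR(20));
CREATE TABLE posts (post_id INTEGER PRIMARY KEY, user_id INT);
CREATE TABLE pages (page_id INTEGER PRIMARY KEY, user_id INT);
INSERT INTO users (name) VALUES ('Matt'), ('Simon'), ('Jen');
-- Invented sample data: Matt has 2 posts and 1 page, Simon has 1 post,
-- Jen has 2 pages.
INSERT INTO posts (user_id) VALUES (1), (1), (2);
INSERT INTO pages (user_id) VALUES (1), (3), (3);
""")

# Dependent (correlated) subqueries: each COUNT(*) is re-evaluated per
# row of users, so no JOIN or GROUP BY is needed.
rows = cur.execute("""
SELECT name,
       (SELECT COUNT(*) FROM posts
        WHERE posts.user_id = users.user_id) AS post_count,
       (SELECT COUNT(*) FROM pages
        WHERE pages.user_id = users.user_id) AS page_count
FROM users
""").fetchall()
print(rows)  # [('Matt', 2, 1), ('Simon', 1, 0), ('Jen', 0, 2)]
```

Because each subquery targets one table independently, adding another counted table means adding one more subquery to the select list rather than another JOIN plus more GROUP BY columns.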