Search found 37 matches
- Thu 09 Jul 2020 15:21
- Forum: dotConnect for Cloud Applications
- Topic: Where empty string or Null
- Replies: 2
- Views: 20439
Re: Where empty string or Null
I think I tried that as well and received an SOQL error. I'll give it another shot, though. Thanks.
- Mon 29 Jun 2020 15:36
- Forum: dotConnect for Cloud Applications
- Topic: Where empty string or Null
- Replies: 2
- Views: 20439
Where empty string or Null
Hello. Using Devart.Data.Salesforce 3.5.980.0 in VS 2017.
I’m attempting to run a query on the Salesforce Campaign object. I’m wanting to return all rows where the ParentID field (Reference Field) is blank, null, empty, whatever.
I’ve tried:
Code: Select all
Campaign.ParentID = ''
Code: Select all
Campaign.ParentID = Null
Code: Select all
(Campaign.ParentID = '' OR Campaign.ParentID = Null)
None of these return any rows, even though rows exist where the ParentID is blank. What other options can I try?
Thanks.
Chris
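For reference, SOQL uses an unquoted null literal when filtering on an empty reference field; a quoted empty string will not match, and typographic quotes pasted from a word processor will not parse at all. A sketch of the filter, assuming the standard Campaign object fields:
Code: Select all
SELECT Id, Name FROM Campaign WHERE ParentId = null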
- Thu 14 May 2020 03:00
- Forum: dotConnect for Cloud Applications
- Topic: Downloading Backups from Salesforce
- Replies: 2
- Views: 12480
Re: Downloading Backups from Salesforce
Okay, thanks for the reply.
- Mon 11 May 2020 18:28
- Forum: dotConnect for Cloud Applications
- Topic: Downloading Backups from Salesforce
- Replies: 2
- Views: 12480
Downloading Backups from Salesforce
Hello,
Is it possible to download a file from Salesforce (in this case I’m looking to grab a .zip file created by the backup process) using the dotConnect for Salesforce component?
If so, do you have an example?
Thanks,
Chris
- Thu 09 Apr 2020 13:58
- Forum: dotConnect for PostgreSQL
- Topic: Error after installing dotConnect for Salesforce
- Replies: 2
- Views: 3292
Re: Error after installing dotConnect for Salesforce
Thank you.
I was able to find installers that contain the same version of the Devart.Data.dll component, and after running both all is well.
Thank you for your help.
Chris.
- Tue 07 Apr 2020 19:47
- Forum: dotConnect for PostgreSQL
- Topic: Error after installing dotConnect for Salesforce
- Replies: 2
- Views: 3292
Error after installing dotConnect for Salesforce
Hi there,
I understand that this is the dotConnect for PostgreSQL forum. I currently have dotConnect for PostgreSQL in my project. Recently I downloaded and installed a trial for dotConnect for Salesforce. Afterwards I started Visual Studio (2017) and received two errors.
The 'Devart.Data.PostgreSQL.Vs.PgSqlDataProvidorPackage' Package (5.0.24040.0) did not load correctly.
and the same error for the Devart.Data.Salesforce package, showing a different version (5.0.2375.0).
I understand that they both use the same Devart.Data.dll. I pointed the project to the PostgreSQL one as it is newer.
How do I resolve this conflict?
Thanks,
Chris Campbell
- Fri 03 Apr 2020 14:35
- Forum: dotConnect for Cloud Applications
- Topic: Connect to Salesforce scratch/Sandbox environment
- Replies: 3
- Views: 11901
Re: Connect to Salesforce scratch/Sandbox environment
Well, turns out I was provided an incorrect Token. Sorry for the noise.
FYI: To successfully connect to a scratch environment, use: test.salesforce.com
I'm guessing most people know this but there you are.
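For anyone else setting this up, a sandbox/scratch connection string might look roughly like the following (the parameter names here are an assumption based on typical dotConnect connection-string conventions, not confirmed against the Salesforce provider's documentation; substitute your own credentials):
Code: Select all
Host=test.salesforce.com;User Id=user@example.com.sandbox;Password=...;Security Token=...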
- Thu 02 Apr 2020 19:07
- Forum: dotConnect for Cloud Applications
- Topic: Connect to Salesforce scratch/Sandbox environment
- Replies: 3
- Views: 11901
Connect to Salesforce scratch/Sandbox environment
Hi, I'm a long time user of your dotConnect for PostgreSQL and recently stumbled across your dotConnect for Salesforce.
In testing, I'm attempting to establish a connection to a scratch environment and it's not allowing access. I get the message:
Invalid User Name, Password, Security Token, or User is locked out.
Curiously I can sign into a production environment just fine.
For the URL I'm using test.salesforce.com
For Login Name I'm using the same login name that works manually signing in
For Password I'm using the same that works manually signing in.
I've confirmed that the user name, password, and security token are correct. What are some other options I can look at?
Thanks.
Chris Campbell
- Thu 24 Aug 2017 16:46
- Forum: dotConnect for PostgreSQL
- Topic: Server did not respond within the specified timeout interval
- Replies: 10
- Views: 23411
Re: Server did not respond within the specified timeout interval
So after resurrecting this issue I’m happy to report that we can again let it rest in peace.
As I said earlier, the PgSqlConnection object does not have a Default Command Timeout property. However, I was able to set it in the connection string during initialization:
Code: Select all
mPgCn = New PgSqlConnection("Default Command Timeout=0")
Once I did this it worked fine.
Sorry for the noise.
- Thu 24 Aug 2017 16:02
- Forum: dotConnect for PostgreSQL
- Topic: Server did not respond within the specified timeout interval
- Replies: 10
- Views: 23411
Re: Server did not respond within the specified timeout interval
Hello,
I'm resurrecting this issue because there was never a resolution posted about the error message itself.
Devart.Data version 5.0.1750.0
Devart.Data.PostgreSQL version 7.9.958.0
Here is the code:
Code: Select all
mPgCn = New PgSqlConnection
mPgCn.Name = "dt4"
mPgCn.Host = gstrpg_HostName
mPgCn.Port = gstrpg_Port
mPgCn.Database = gstrpg_DatabaseName
mPgCn.UserId = gstrpg_Userid
mPgCn.Schema = IIf(blnPublic = False, gstrpg_Schema, "public")
mPgCn.Password = gstrpg_PW
mPgCn.ConnectionTimeout = 0
mPgCn.Unicode = True
mPgCn.Open()
cmdRunPG = mPgCn.CreateCommand
cmdRunPG.CommandTimeout = 0
cmdRunPG.CommandText = "set application_name = '" & gclsSession.App_ApplicationName & "'"
cmdRunPG.ExecuteNonQuery()
dAdaptPG = New PgSqlDataAdapter(strSQL, mPgCn)
dAdaptPG.Fill(dSet) ' (Boom Line)
Error Message:
Server did not respond within the specified timeout interval.
Connection String:
User Id=postgres;Host=localhost;Database=dt4_0000;Unicode=True;Connection Timeout=0;Initial Schema=ds0837;
Stack Trace:
at Devart.Data.PostgreSql.PgSqlDataReader.f(Int32 A_0)
at Devart.Data.PostgreSql.PgSqlCommand.InternalExecute(CommandBehavior behavior, IDisposable stmt, Int32 startRecord, Int32 maxRecords)
at Devart.Common.DbCommandBase.InternalExecute(CommandBehavior behavior, IDisposable stmt, Int32 startRecord, Int32 maxRecords, Boolean nonQuery)
at Devart.Common.DbCommandBase.ExecuteDbDataReader(CommandBehavior behavior, Boolean nonQuery)
at Devart.Common.DbCommandBase.ExecuteDbDataReader(CommandBehavior behavior)
at System.Data.Common.DbCommand.System.Data.IDbCommand.ExecuteReader(CommandBehavior behavior)
at System.Data.Common.DbDataAdapter.FillInternal(DataSet dataset, DataTable[] datatables, Int32 startRecord, Int32 maxRecords, String srcTable, IDbCommand command, CommandBehavior behavior)
at System.Data.Common.DbDataAdapter.Fill(DataSet dataSet, Int32 startRecord, Int32 maxRecords, String srcTable, IDbCommand command, CommandBehavior behavior)
at System.Data.Common.DbDataAdapter.Fill(DataSet dataSet)
at Donation_Tracker.clsData.GetDatasetTable(String strSQL, String strNodeName, Boolean blnPublic) in C:\DevNet\Donation Tracker\Main\clsData.vb:line 9946
Postgres Server Log of Event:
2017-08-24 08:37:59 PDT LOG could not send data to client: An established connection was aborted by the software in your host machine.
2017-08-24 08:37:59 PDT STATEMENT Select contact.contactkey, contact.accountname, contact.title, contact.firstname, contact.middlename, contact.lastname, contact.suffix, CASE WHEN contact.searchname = '' THEN contact.accountname ELSE contact.searchname END as searchname, contacttype.descript as contacttype, contact.fk_contacttype, contact.notes as accountnotes, contact.dob_day, contact.dob_month, contact.dob_year, CASE WHEN contact.accountcode<>'' THEN 'Contact ID: ' || contact.contactkey || ' Code: ' || contact.accountcode ELSE 'Contact I
2017-08-24 08:37:59 PDT FATAL connection to client lost
2017-08-24 08:37:59 PDT STATEMENT Select contact.contactkey, contact.accountname, contact.title, contact.firstname, contact.middlename, contact.lastname, contact.suffix, CASE WHEN contact.searchname = '' THEN contact.accountname ELSE contact.searchname END as searchname, contacttype.descript as contacttype, contact.fk_contacttype, contact.notes as accountnotes, contact.dob_day, contact.dob_month, contact.dob_year, CASE WHEN contact.accountcode<>'' THEN 'Contact ID: ' || contact.contactkey || ' Code: ' || contact.accountcode ELSE 'Contact I
Running the query itself in pgAdmin III takes about five minutes. When executed through the data connection, it times out after 60 seconds.
I searched for a "DefaultConnectionTimeout" property on the data connection object and could not find it.
Any tips would be most appreciated.
Chris Campbell
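A possible explanation for the 60-second cutoff: a PgSqlDataAdapter built directly from strSQL creates its own select command internally, and that implicit command receives the default command timeout rather than the zero set on cmdRunPG. A sketch of one workaround, constructing the adapter from an explicit command instead (untested; assumes the standard ADO.NET adapter-from-command constructor that Devart's provider follows):
Code: Select all
Dim cmdFill As PgSqlCommand = mPgCn.CreateCommand()
cmdFill.CommandText = strSQL
cmdFill.CommandTimeout = 0 ' disable the per-command timeout on the fill command itself
dAdaptPG = New PgSqlDataAdapter(cmdFill)
dAdaptPG.Fill(dSet)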
- Tue 30 May 2017 15:39
- Forum: dotConnect for PostgreSQL
- Topic: pgDump TAR format
- Replies: 2
- Views: 2600
Re: pgDump TAR format
Okay. Thank you.
- Tue 30 May 2017 15:38
- Forum: dotConnect for PostgreSQL
- Topic: pgDump Compression
- Replies: 2
- Views: 2642
Re: pgDump Compression
Okay. Thank you.
- Fri 26 May 2017 14:40
- Forum: dotConnect for PostgreSQL
- Topic: pgDump Compression
- Replies: 2
- Views: 2642
pgDump Compression
Hi, do the pgDump Backup and Restore operations support the -Fc and -Z0 parameters? If so, how would you recommend using them?
Thanks.
Chris
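For comparison, the parameters in question map onto the native pg_dump command line as follows (-Fc selects the custom archive format, -Z0 sets the compression level to zero, i.e. no compression):
Code: Select all
pg_dump -Fc -Z0 -f mydb.backup mydb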
- Fri 26 May 2017 13:56
- Forum: dotConnect for PostgreSQL
- Topic: pgDump TAR format
- Replies: 2
- Views: 2600
pgDump TAR format
Hello, how do I set the pgDump/pgRestore object(s) to back up to TAR format? Currently it backs up using plain text.
Thank you.
Chris.
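For reference, the native pg_dump equivalent uses -Ft to select the tar archive format, which pg_restore can then read back:
Code: Select all
pg_dump -Ft -f mydb.tar mydb
pg_restore -d mydb mydb.tar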
- Tue 16 Aug 2016 20:36
- Forum: dotConnect for PostgreSQL
- Topic: Issue when restoring very large backup file
- Replies: 1
- Views: 1438
Issue when restoring very large backup file
Hello,
I’m currently using dotConnect for PostgreSQL (7.6.714.0).
I’m having an issue restoring a very large backup file using PgSqlDump. The backup file is 2.2 GB and has almost 10 million lines.
The backup file consists of the table structures, data, and constraints with the insert commands. These all get loaded into a newly created schema. I don’t know if this will be helpful but I’ve included the backup file header:
Code: Select all
SET client_encoding = 'UTF8';
SET standard_conforming_strings = off;
SET check_function_bodies = false;
SET client_min_messages = warning;
SET escape_string_warning = off;
SET default_tablespace = '';
SET default_with_oids = false;
What’s happening is that the restore seems to be skipping one row in one of the tables that has many thousands of rows. I then receive a data integrity error when the constraints are applied at the end of the restore process, because that one record never got added back into the table. The record that got skipped is in the backup file that was created using PgSqlDump.
If I repeat the process by creating a backup and restoring the backup, it always skips the same record. It also only skips one record (that I’m aware of). All other records are restored just fine.
As a test, I deleted the record that was getting skipped, thinking perhaps there was something wrong with it. I performed another backup and attempted to restore the backup. I received the same error, only for a different record from the same table. This record generating the error had restored just fine in the first test.
I don’t believe it’s a data connection timeout issue because the record that gets skipped is not near the end of the file. The table itself contains just straight data, no blobs or other unusual data types. I’ve tried this using three different versions of Postgres (9.3, 9.4, and 9.5) and always get the same results.
If you have any suggestions on settings or other things I can try I would greatly appreciate it.
Regards,
Chris