Statistical Packages

Statistical packages are software titles, such as JMP and GNU Octave, and programming languages, such as MATLAB, R and SAS, that are used to discover, explore and analyze data and suggest useful conclusions, either to learn something unexpected or to confirm a hypothesis. The field includes the design and analysis of techniques to give approximate but accurate solutions to hard problems in statistics, econometrics, time-series, optimization and 2D- and 3D-visualization. Data analysis has multiple facets and approaches, encompassing diverse techniques under a variety of names, in different business, science, and social science domains.


Below is a VBA macro that builds a formula for subtotalling filtered data, and it works fine if I hard-code the column number in the relevant part of the formula.
However, I would like to define the column number as a variable, i.e. column number 2 or column number 6, etc.
I have seen some suggestions for how this might work, but the formula I am using does not seem to accept them.
I know what I have put in is wrong, but I am trying to figure out the correct version, if one is possible.

Sub Analysis_Testdev()
    Dim ColNumberx As Integer
    ColNumberx = 2

    Sheets("WsheetA").Select
    Range("B10").Select
    Application.CutCopyMode = False

    ActiveCell.FormulaR1C1 = _
        "=IF('123WsheetA'!RC[-1]>-.1,SUMPRODUCT(SUBTOTAL(109,OFFSET('123WsheetA'!R48C5,ROW('WsheetA'!R49C5:R20000C5)-ROW('WsheetA'!R48C5),,1)),--('WsheetA'![R49C&ColNumberx]:[R20000C&ColNumberx]='WsheetA'!RC[-1])),"""")"

    ' My problem is getting this part of the formula to work with the column
    ' number taken from the variable ColNumberx:
    '   --('WsheetA'![R49C&ColNumberx]:[R20000C&ColNumberx]='WsheetA'!RC[-1])),"""")"
    ' The aim is to create a valid formula in B10 based on whatever column I
    ' choose and then copy it to cells B10:B29, but I can't get the column
    ' number to come from the variable.

    Range("B10").Select
    Selection.Copy
    Range("B10:B29").Select
    ActiveSheet.Paste

End Sub

I have to install an R program and several open-source proteomics programs on a Linux box. The R program calls the open-source proteomics programs. The pipeline starts with one of the programs taking a raw data file as input, and all of the programs produce output files, some of which may be input to other programs in the pipeline. The users will be able to run the R program online.
1. Is it possible to do this? I am using the open-source version of RStudio, which is single-threaded (a user's request waits until the previous user in the chain finishes running the pipeline). This implies that the proteomics programs called by R will be invoked by one user at a time.

2. Is there any way of synchronizing the Linux box and the users' PCs so that the files can be created on both the Linux server and the PCs? Otherwise the users will have to send the raw data file to the Linux box and then import the result files from the Linux box back to their PCs.
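On question 1: calling external command-line tools from R and chaining their files together is routine, for example with system2(). Below is a minimal sketch of the chaining idea only; "converter_tool", "search_tool" and their flags are hypothetical placeholders, not the actual proteomics programs.

# Sketch only: tool names, flags and file names are hypothetical placeholders.
run_pipeline <- function(raw_file, work_dir = tempdir()) {
  step1_out <- file.path(work_dir, "step1.mzML")
  status <- system2("converter_tool", c("--in", shQuote(raw_file), "--out", shQuote(step1_out)))
  if (status != 0) stop("step 1 failed")

  step2_out <- file.path(work_dir, "step2.tsv")
  status <- system2("search_tool", c("--spectra", shQuote(step1_out), "--out", shQuote(step2_out)))
  if (status != 0) stop("step 2 failed")

  step2_out  # path to the final result, to be returned to the user
}

Because each system2() call blocks until the external program exits, a single-session R front end does serialize users, as described. On question 2, the usual pattern is not to synchronize file systems at all but to have users upload the raw file and download the results through the web front end, keeping all intermediate files on the Linux server.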
I have written the function below. It works, but it is slow. On my Windows 7 R installation, what should I do to get this function working with the parallel library? Or is there some other obvious performance improvement I could make?

I followed the answer below, which led me to try vectorising, but the improvement is minimal. Given that I have another 23 cores and 50 GB of RAM available, I suspect the biggest improvement would come from parallel processing, albeit tricky to do on my Windows OS with my newly learnt R skills.

https://stackoverflow.com/questions/2908822/speed-up-the-loop-operation-in-r 

# Build the encoding function

  encode <- function(dataframe, columnName, code_key){
    
    library(dplyr)
    asc <- function(x) { strtoi(charToRaw(x),16L) }
    chr <- function(n) { rawToChar(as.raw(n)) } 
    encoded <- c()
    
    for (j in 1:length(dataframe[[columnName]])) {
      asc1<- NULL
      asc1 <- c()
      if((j%%(1E4)) == 0) {print(paste0(j," of ",length(dataframe[[columnName]]), " records processed"))}
      
      for (i in 1:nchar(dataframe[[columnName]][j])) {
        asc1[i] <- chr(asc(substr(dataframe[[columnName]][j], i, i ))  + i + code_key)  
        encoded[j] <- paste(asc1, collapse='')}} 
    
    encName <- paste0(columnName, "_Encoded")
    dataframe[[encName]] <- encoded
    return(dataframe)
    }

# Example data set to run the function on

  df1 <- as.data.frame(rep(iris$Species, 10000))
  colnames(df1) <- "Species"
  df1$Species <- 

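For reference, here is a minimal sketch of one way the function could be parallelised on Windows with the parallel package's socket clusters. It assumes plain-ASCII input, so utf8ToInt()/intToUtf8() stand in for the raw-byte round trip above, and the per-string work is moved into a small helper that parSapply() can distribute.

# Sketch only, under the assumptions above; not a drop-in replacement.
library(parallel)

encode_one <- function(s, code_key) {
  codes <- utf8ToInt(s)                            # character codes of the string
  intToUtf8(codes + seq_along(codes) + code_key)   # shift each code by position + key
}

encode_parallel <- function(dataframe, columnName, code_key) {
  cl <- makeCluster(max(1, detectCores() - 1))     # socket cluster works on Windows
  on.exit(stopCluster(cl))
  clusterExport(cl, "encode_one")                  # encode_one must be defined globally
  encoded <- parSapply(cl, as.character(dataframe[[columnName]]),
                       encode_one, code_key = code_key, USE.NAMES = FALSE)
  dataframe[[paste0(columnName, "_Encoded")]] <- encoded
  dataframe
}

Even without the cluster, replacing the inner character loop with the vectorised utf8ToInt()/intToUtf8() pair removes most of the per-row cost.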
I recently procured Visual Studio 2017 Professional and am trying to get hands-on with R Tools.
I created a new R project and wrote a custom (user-defined) function. The function generates four sub-graphs in one main graph using par(mfrow=c(2,2)).
My function works well in regular R, version 3.4.
When I try the same function in R Tools in Visual Studio 2017, I get an error:
Error in plot.new() : figure margins too large
What could be the problem? Any solutions for rectification?
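A minimal sketch of how this is commonly diagnosed; the assumption (not confirmed for R Tools specifically) is that the plot pane in the IDE is smaller than a 2x2 layout with default margins needs, so checking the device size and shrinking the margins, or opening a larger device, usually clears the error.

# Sketch only: the sizes below are arbitrary examples.
dev.size("in")                              # how big is the active graphics device?
dev.new(width = 10, height = 8)             # or simply enlarge the plot pane in the IDE
par(mfrow = c(2, 2), mar = c(4, 4, 2, 1))   # smaller per-panel margins than the default
plot(1:10); hist(rnorm(100)); boxplot(rnorm(50)); plot(sin(seq(0, 6, 0.1)))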
I am building a two-tier Microsoft PKI infrastructure.
I have one offline root CA and two issuing CAs running Windows Server 2012 R2. I want to have one active issuing CA and the second CA as a standby in a disaster recovery site.
How should I configure the CDP and AIA locations? Do I need a shared location where both CAs can access the CRL information, or can I make the CDP and AIA locations local to the issuing CA and rely on a backup/restore if I need to activate the second CA in DR?

Thanks,
Hello All Experts,
I am a student enthusiastic about learning "Data Analytics". Which is the best platform to learn it for FREE?
I want to learn 'Data Science (Statistics)' and 'SAS/R' from scratch.
Any videos? Any websites? Any blogs?

Thanks,

Regards,
Satish Kumar G N
Hi All,
While using the REF keyword in my logical file, I get a compilation error: "Record name same as name of file being created".

DDS of LF -

*************** Beginning of data *************************************
                                            REF(ACCOUNT)                
                R USEREF                                                
                  ACCLVL    R               REFFLD(ACCLEVELID ACCOUNT)  
                  ACCORG    R               REFFLD(ACTORGCOD  ACCOUNT)  
                  ACCNUM    R               REFFLD(ACCOUNTNUM ACCOUNT)  
****************** End of data ****************************************

May I know why that is?
The issue is that when I set a different value, it updates neither my textblock.Text nor my listbox.Items.

Help would be very appreciated. :)

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Runtime.InteropServices.WindowsRuntime;
using Windows.Foundation;
using Windows.Foundation.Collections;
using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;
using Windows.UI.Xaml.Controls.Primitives;
using Windows.UI.Xaml.Data;
using Windows.UI.Xaml.Input;
using Windows.UI.Xaml.Media;
using Windows.UI.Xaml.Navigation;
using Windows.Services.Maps;
using Windows.Devices.Geolocation;


// The Blank Page item template is documented at https://go.microsoft.com/fwlink/?LinkId=402352&clcid=0x409

namespace New_World_Map
{
    /// <summary>
    /// An empty page that can be used on its own or navigated to within a Frame.
    /// </summary>
    public sealed partial class MainPage : Page
    {
       

     

        List<string> stringlist = new List<string>();

        public MainPage()
        {
            this.InitializeComponent();

            this.RightTapped += MainPage_RightTapped;

            mapscontrol.CenterChanged += Mapscontrol_CenterChanged;

            listbox.DoubleTapped += Listbox_DoubleTapped;

            listview.Items.Add("Zoom In");

            listview.Items.Add("Zoom Out");

            listview.Items.Add("Navigate North");

            listview.Items.Add("Navigate South");

 …
write.csv(df,file="~C:/Users/anitha/Documents/social_media analysis/socialmedia/tweets.csv",row.names=FALSE,append = TRUE)
Error in file(file, ifelse(append, "a", "w")) :
  cannot open the connection
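A minimal sketch of one likely fix, on the assumption that the stray "~" in front of "C:" is what makes the path unopenable (write.csv() also ignores append, so that argument does nothing here):

out <- "C:/Users/anitha/Documents/social_media analysis/socialmedia/tweets.csv"
dir.create(dirname(out), recursive = TRUE, showWarnings = FALSE)  # make sure the folder exists
write.csv(df, file = out, row.names = FALSE)

# If appending to an existing file is really needed, write.table() allows it:
# write.table(df, out, sep = ",", row.names = FALSE, col.names = FALSE, append = TRUE)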
It's supposed to be a map guide, an accurate GPS for a car, giving the exact route along the roads the car must take.

The underlined line is the one the debugger flags as wrong.

Any other ideas are welcome.

using System;


using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Runtime.InteropServices.WindowsRuntime;
using Windows.Foundation;
using Windows.Foundation.Collections;
using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;
using Windows.UI.Xaml.Controls.Primitives;
using Windows.UI.Xaml.Data;
using Windows.UI.Xaml.Input;
using Windows.UI.Xaml.Media;
using Windows.UI.Xaml.Navigation;
using Windows.Devices.Geolocation;
using Windows.Services.Maps;


// The Blank Page item template is documented at http://go.microsoft.com/fwlink/?LinkId=402352&clcid=0x409

namespace App75
{
    /// <summary>
    /// An empty page that can be used on its own or navigated to within a Frame.
    /// </summary>
    public sealed partial class MainPage : Page
    {
        public MainPage()
        {
            this.InitializeComponent();

            button.Tapped += Button_Tapped;
        }

        private async void Button_Tapped(object sender, TappedRoutedEventArgs e)
        {
            BasicGeoposition b1 = new BasicGeoposition();

            b1.Latitude = Convert.ToDouble(startpositionlatitude.Text);

            b1.Longitude = Convert.ToDouble(startpositionlongitude.Text);

            BasicGeoposition b2 = new BasicGeoposition();

Hi,

I am fairly new to R. I am doing some simple visualization in a Shiny app, and I am trying to flip a bar chart downward using scale_y_reverse(). It works well when I run my code in the R console, but when I run it in Shiny it does not flip the bar chart. Below is my code in the server part:

output$trendbarPlot <- renderPlotly({
                              mydat <- mydatCopy %>% filter(Country ==input$Country)
                              

attacksbarplot = ggplot(data=mydat,aes(x=as.factor(Year))) + geom_bar() + theme_bw(base_size=35) + xlab("") + ylab("") + theme(axis.text.x = element_blank(), axis.ticks=element_blank(),panel.grid.major=element_blank(),panel.grid.minor=element_blank(),panel.border=element_blank())  + scale_y_reverse()


attacksbarplotnol = ggplot(data=mydat,aes(x=as.factor(Year))) + geom_bar() + theme_bw(base_size=15) + xlab("") + ylab("") + theme(axis.text.x = element_blank(), axis.text.y = element_blank(), axis.ticks=element_blank(),panel.grid.major=element_blank(),panel.grid.minor=element_blank(),panel.border=element_blank()) +  scale_y_reverse()
 
                              })

The attached file shows the required flipped bar chart in Shiny.

Does anyone know how I can solve this issue?
FlippedChart.png
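A minimal sketch of what the server block might look like, assuming plotly, ggplot2 and dplyr are loaded as in the app: renderPlotly() only renders whatever the last expression returns, so the ggplot has to be passed through ggplotly() and returned, at which point the scale_y_reverse() axis should carry over.

# Sketch only; most of the theme styling is trimmed for brevity.
output$trendbarPlot <- renderPlotly({
  mydat <- mydatCopy %>% filter(Country == input$Country)
  p <- ggplot(mydat, aes(x = as.factor(Year))) +
    geom_bar() +
    theme_bw(base_size = 15) +
    xlab("") + ylab("") +
    scale_y_reverse()    # keep the downward-flipped bars
  ggplotly(p)            # the returned value is what Shiny renders
})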
My data:

Gage_number Latitude    Longitude   Date    Gage_1  Gage_2  Gage_3

1   35.02   -80.84  1/1/2002    0.23    0   0.7
2   35.03   -81.04  1/2/2002    0   0   0.2
3   35.06   -80.81  1/3/2002    3.2 2.1 0.1
This is just a subset of the data; I have around 50 gauge stations. I want to find the spatial autocorrelation between my gauge stations for rainfall, based on the distance between them. I have created my distance matrix, but I don't want to use any extra library in R; I want to do all the steps in a function.

loc <- read.table("rain_data.txt",header=TRUE,fill=TRUE)  
gauge.dists <- as.matrix(dist(cbind(loc$Latitude, loc$Longitude))) #distance matrix
Now, since the distance between gauges is not uniform, I want to use a certain bin size to decide on the distance lags.

Pseudocode:

If the distance between gauge pair 1-2 is 1 meter, then assign a distance lag of 1, and so on: lag 1 = inter-gauge distance of 1 meter, lag 5 = inter-gauge distance of 5 meters. After creating that matrix I will find the autocorrelation between gauge pairs.

So for lag 1 the inter-gauge distance = 1; for lag 5 the inter-gauge distance = 5.

Gage pair   date    RainA   RainB       Gage pair   date    RainA   RainB

1-2 1/1/2002    0.23    0       1-3 1/1/2002    0.23    0.7
1-2 1/2/2002    0   0       1-3 1/2/2002    0   0.2
1-2 1/3/2002    3.2 2.1     1-3 1/3/2002    3.2 0.1
I am having a hard time translating this into a loop or a function. Any ideas?
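A minimal sketch of the binning-and-correlating step in base R only. It assumes the rainfall series sit in a numeric matrix with one column per gauge (column names such as Gage_1 taken from the sample) and a 1-meter bin width; both are assumptions to adjust to the real layout.

# Sketch only, under the layout assumptions above.
lag_correlation <- function(rain, dists, bin_width = 1) {
  n        <- ncol(rain)
  pairs    <- t(combn(n, 2))                      # every gauge pair (i, j)
  pair_dst <- dists[pairs]                        # distance for each pair
  pair_lag <- ceiling(pair_dst / bin_width)       # distance lag = bin index
  pair_cor <- apply(pairs, 1, function(p)
    cor(rain[, p[1]], rain[, p[2]], use = "complete.obs"))
  tapply(pair_cor, pair_lag, mean)                # mean correlation per lag
}

# rain <- as.matrix(loc[, grep("^Gage_", names(loc))])
# lag_correlation(rain, gauge.dists)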
I am a bit new to R, so I am not sure if this is possible or if it's more difficult than I am assuming.

Objective: I want to find the correlation between diagnosis codes. If patient #1 has condition X, what is the likelihood that he will at some point also have condition Y?

Here is what I have:
136,337 Unique patient IDs (74,527 Female, 61,810 Male)
34,442 Unique Diagnosis that exists in my population
7,777,728 Unique observations

So my 2 questions are:
1. How should I lay out my table for R?
Right now I have the table columns as:
ID, SEX, Diagnosis

2. What should my R script look like in order to create correlation coefficients between all my diagnosis codes?

FYI: Yes, I also have a timestamp per diagnosis code, but adding it now would just add more confusion to the confusion I already have.
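On question 2, a minimal sketch of one approach, assuming a data frame dx with the columns ID, SEX and Diagnosis described above: build a sparse patient-by-diagnosis 0/1 matrix, then correlate the diagnosis columns. With roughly 34,000 codes the full correlation matrix is enormous, so the sketch restricts itself to the most frequent codes.

# Sketch only; dx is the hypothetical long-format table (ID, SEX, Diagnosis).
library(Matrix)

dx$ID        <- factor(dx$ID)
dx$Diagnosis <- factor(dx$Diagnosis)

inc <- sparseMatrix(i = as.integer(dx$ID),         # rows = patients
                    j = as.integer(dx$Diagnosis),  # cols = diagnosis codes
                    x = 1,
                    dimnames = list(levels(dx$ID), levels(dx$Diagnosis)))
inc@x[] <- 1                                       # repeat diagnoses -> presence/absence

top_codes  <- names(sort(colSums(inc), decreasing = TRUE))[1:50]  # e.g. 50 commonest codes
cor_matrix <- cor(as.matrix(inc[, top_codes]))     # pairwise (phi) correlations

On question 1, the long layout you already have (one row per ID/Diagnosis observation) is exactly what this kind of reshaping needs.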
I have an Excel file that I want to add two new columns to, and then group and sum the new and other columns in RStudio and save the output. I'm not entirely sure how to do this.

Adding two new columns:
if Sec_flag is "Y" then I want to add a new column called Sec_checked and put a 1 as the value
if stu_status is "Ret" I want to add another new column Stu_check and put a 1 as the value

Group & Sum
I would like to group the data by the columns Year, Month, Stu_status, Point1, Point2 and Point3 and sum the values in stu_fee, stu_return_fee, student_count, Sec_checked and Stu_check.
Over time I will add new data points to my Excel file, so I would like to be able to add these in the future and get new groupings.

I tried using plyr (below), but I don't know how to add the new columns or group and sum the data.
setwd("C:/Desktop/rtest")
system("java -version")

library(xlsx)
mydata <- read.xlsx("stu_d_sample.xlsx", sheetName = "Sample") 
mydata


library(plyr)
groupColumns = c("year","month", "Stu_status","Point1","Point2","Point3")
dataColumns = c("stu_fee", "stu_return_fee","student_count", "Sec_checked", "stu_check")
res = ddply(baseball, groupColumns, function(x) colSums(x[dataColumns]))
head(res)


stu_d_sample---Copy.xlsx
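A minimal sketch of the whole flow using dplyr instead of plyr (one option among several); the column names are taken from the question and assumed to match the spreadsheet's exact spelling and case.

# Sketch only, under the column-name assumptions above.
library(xlsx)
library(dplyr)

mydata <- read.xlsx("stu_d_sample.xlsx", sheetName = "Sample")

res <- mydata %>%
  mutate(Sec_checked = ifelse(Sec_flag == "Y", 1, 0),        # new flag columns
         Stu_check   = ifelse(Stu_status == "Ret", 1, 0)) %>%
  group_by(Year, Month, Stu_status, Point1, Point2, Point3) %>%
  summarise(across(c(stu_fee, stu_return_fee, student_count,
                     Sec_checked, Stu_check),
                   ~ sum(.x, na.rm = TRUE)),
            .groups = "drop")

write.xlsx(as.data.frame(res), "stu_d_grouped.xlsx", row.names = FALSE)

Data points added to the spreadsheet later only need to be appended to the group_by() or across() lists.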
2 Questions about regression in R
 
  Question 1:
 
  Let's say I create a model that relates the unique words found in a corpus to the number of lines read. Notice that this model takes the logs of BOTH the outcome and the predictor.
 
  x <- lm( log(Words) ~ log(Lines) )
 
  Does that mean that exp(predict(x,list(Lines=100000))) will give me the number of words for a given number of lines? Or will it give me the LOG of a number of words for a given number of lines?
 
  Question 2:
 
  How do I invert this model so that I can input a number of words, and get back a prediction for the number of lines required in order to obtain this quantity of words?
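A minimal sketch of both points, with Words and Lines standing in for the corpus vectors. On question 1, predict() returns fitted values on the scale of the left-hand side, i.e. log(Words), so exponentiating gives a word count (strictly, a back-transformed median-type estimate). On question 2 there are two common routes: refit with the roles swapped, or invert the fitted equation algebraically; they are not identical, so the sketch shows both.

# Sketch only; Words and Lines are hypothetical vectors.
x <- lm(log(Words) ~ log(Lines))

# Q1: exponentiate the predicted log to get words for 100,000 lines.
words_hat <- exp(predict(x, newdata = data.frame(Lines = 100000)))

# Q2, option A: refit the regression the other way round and predict lines.
x_inv     <- lm(log(Lines) ~ log(Words))
lines_hat <- exp(predict(x_inv, newdata = data.frame(Words = 50000)))

# Q2, option B: invert the fitted equation log(W) = a + b*log(L) algebraically.
a <- coef(x)[1]; b <- coef(x)[2]
lines_hat2 <- exp((log(50000) - a) / b)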
Hello all,
I have a situation where a common value has to be computed from the available data, but the data contains different summary statistics. For example:
Consider apples in different boxes, where the average size of an apple is to be determined, and the available data consists of the mean size from one box, the standard deviation of size from another box, and the minimum and maximum sizes from other boxes. Is there any way that a general value can be derived to represent the size of the apples?
